Real-Time Traffic Sign Detection

using Hierarchical Distance Matching

by

Craig Northway

A thesis submitted to the

School of Information Technology and Electrical Engineering The University of Queensland

for the degree of

BACHELOR OF ENGINEERING

October 2002


Statement of originality
I declare that the work presented in the thesis is, to the best of my knowledge and belief, original and my own work, except as acknowledged in the text, and that the material has not been submitted, either in whole or in part, for a degree at this or any other university.

Craig Northway


Acknowledgments

There are many people who deserve acknowledgment for the help I have received while working on this thesis and during my 16 years of study. Unfortunately it's impossible to mention them all!

From an academic perspective I must thank my supervisor Brian Lovell for his technical help when I was struggling, his excellent view of the big picture and his promotion of my work. The excellent background Vaughan Clarkson's course, ELEC3600, gave me in this area was invaluable. To Jenna and Ben Appleton and Simon Long for their signals related technical insights. Shane Goodwin deserves mention for his help as the lab supervisor, organising cameras and facilities.

I'd like to thank all of my friends and family, particularly my girlfriend and best friend Sarah Adsett, and my parents Bruce and Rosalie Northway, for their support. Vivien, thanks for the use of her laptop. To my "non-engineering" friends Michael, Jesse and Jon: now it's handed in, you can contact me again. To all the engineers: hope you enjoyed your degree as much as I have! Special thanks goes to Nia and Scott. I'll also single out Toby, Leon and the rest of the SEES exec. "Get Naked for SEES 2002!"


Abstract

Smart Cars that avoid pedestrians and remind you of the speed limits? Vehicles will soon have the ability to warn drivers of pending situations or automatically take evasive action. Due to the visual nature of existing infrastructure, such as signs and line markings, image processing will play a large part in these systems.

This thesis will develop a real-time traffic sign detection algorithm using hierarchical distance matching. There are four deliverables for the thesis:

1. A hierarchy creation system built in MATLAB
2. A prototype matching system, also built in MATLAB
3. A real-time application in Visual C++ using the DirectShow SDK and the IPL Image Processing and OpenCV libraries
4. Examples of other uses for the matching algorithm

The hierarchy creation system is based on simple graph theory and creates small (< 50) hierarchies of templates. The prototype matching system uses static images and was designed to explore the matching algorithm. Matching of up to 20 frames per second using a 30+ leaf hierarchy was achieved in the real-time application, with a few false matches. Other matching examples demonstrated include "Letter" matching, rotational and scale invariant matching.

Future work on this thesis would include the development of a final verification stage to eliminate the false matches, increasing the robustness and applications of the algorithm. Refactoring of the system's design would also allow for larger hierarchies to be created and searched.


Contents

Statement of originality
Acknowledgments
Abstract

1 Introduction

2 Topic
2.1 Extensions
2.2 Deliverables

3 Assumptions

4 Specification
4.1 MATLAB Hierarchy Creation
4.2 MATLAB Matching Prototype
4.3 Real-Time Implementation
4.4 Smart Vehicle System

5 Literature Review
5.1 Historical Work: Chamfer Matching
5.2 Current Matching Algorithms
5.2.1 Current Hierarchical Distance Matching Applications
5.2.2 Other Possible Techniques
5.3 Hierarchies and Trees
5.3.1 Graph Theoretic Approach
5.3.2 Nearest Neighbour
5.3.3 Colour Information

6 Theory
6.1 Chamfer Matching
6.2 Feature Extraction
6.2.1 Edge detection
6.3 Distance Transform
6.4 Distance Matching
6.4.1 Reverse Matching
6.4.2 Oriented Edge Matching
6.4.3 Coarse/Fine Search
6.4.4 Hierarchy Search
6.5 Tree/Hierarchy
6.5.1 Graph Theory
6.5.2 Trees
6.6 Programming

7 Hardware Design and Implementation
7.1 Camera

8 Software Design and Implementation

9 Results

10 Future Development
10.1 Video Footage
10.2 Temporal Information
10.3 Better OO Design
10.4 Improved Hierarchy Generation
10.5 Optimisation
10.6 Final Verification Stage

11 Conclusions

12 Publication
12.1 Australia's Innovators Of The Future

A Appendix
A.1 Assumptions
A.2 Programming
A.3 Extra Hierarchy Implementation Flowcharts
A.4 Prototype Orientation Code
A.5 Rejected Prototype Implementations
A.6 UML of Real-Time System
A.7 Code Details of Real-Time System
A.8 Tested Enhancements/Refinements to Real-Time System
A.9 Hierarchy Results
A.10 Matlab Matching Results
A.11 CD
A.12 Code

List of Figures

1.1 System Output
3.1 Likely Sign Position
4.1 Real Time Traffic Sign System
5.1 Binary Target Hierarchy [1]
5.2 Image Clustering with Graph Theory [2]
6.1 Canny Edge Detection Process
6.2 3-4 Distance Transform (not divided by 3)
6.3 Overlaying of Edge Image [3]
6.4 Original Image
6.5 Distance Image
6.6 Template
6.7 Matching Techniques [4]
6.8 Template Distance Transform
6.9 Search Expansion
6.10 Simple Graph [2]
6.11 Tree

List of Tables

8.1 Hierarchy
A.1 Directional Scoring
A.2 MATLAB Matching
A.3 Real-Time

Chapter 1

Introduction

This thesis will develop a real-time traffic sign matching application. A Traffic Sign Recognition system has the potential to reduce the road toll. By "highlighting" signs and recording signs that have been passed, the system would help keep the driver aware of the traffic situation. There also exists the possibility for computer control of vehicles and prompting for pedestrians and hazardous road situations. The system will be useful for autonomous vehicles and smart cars. After testing the matching on traffic signs, other implementations shall further demonstrate the effectiveness of the algorithm.

Figure 1.1: System Output

If reliable smart vehicle systems can be established on PC platforms, upgrading and producing cars as smart vehicles would be cheap and practical. Some systems already developed by vehicle manufacturers include a night vision system which Cadillac have introduced into their "Deville" vehicles. This system projects an image of the road with obstacles highlighted onto the windscreen. Mercedes-Benz' Freightliner's Lane Departure System uses a video camera to monitor lane changes, alerting the driver to lane changes made without the use of indicators. The European Union are heavily sponsoring research into this technology through a smart vehicle initiative, with a view to decreasing the road toll, much of which is possibly due to driver error or fatigue. Daimler Chrysler have produced a prototype autonomous vehicle capable of handling many and varied traffic scenarios. It uses a vision system to detect pedestrians and traffic signs. This thesis will help establish a working knowledge of such systems and demonstrate the simplicity of algorithm development on a PC platform.

Hierarchical Distance Matching could be applied to a range of other object detection problems. Examples of these include pedestrians, cyclists, motorcyclists, car models, tools, military targets, text of known font, known local landmarks, vehicle identification and mobile robots. These recognition cases could be used in applications such as autonomous vehicles. If the goals of the project can be met, the developed application (C++) and associated utilities (MATLAB) will form a general solution for hierarchy creation and implementation.

Chapter 2

Topic

The topic for this thesis is Real-Time Traffic Sign Matching, using Hierarchical Distance (Chamfer) Matching. From the background research into shape based object recognition it was obvious that Gavrila's Chamfer methods [4, 5] are superior to other approaches for implementation on a general purpose PC platform. Gavrila's success defined the topic and prompted further research into hierarchies and distance matching. This thesis intends to prove the hypothesis that multiple object detection, such as traffic signs, can be successful in real-time using a hierarchy of images.

Gavrila's success is due to the simplicity of his algorithm and its suitability to standard computation and the SIMD instructions. It involves repeated simple operations (such as addition and multiplication) on the data set, which is efficient when computed in this manner. Another factor contributing to the speed of the algorithm is its coarse/fine and hierarchical nature, allowing significant speed-ups (without sacrificing accuracy) when compared to exhaustive matching. It can be mathematically shown [4, 5] that this pyramid style search will not miss a match. Thus video footage can be searched for N objects simultaneously, without the extensive calculations necessary for an exhaustive search, in real-time on a general purpose platform.

Other methods used for traffic sign detection have included colour detection [6, 7], colour then shape [8, 9], simulated annealing [10] and neural networks [11]. None of these have been able to produce an accurate real-time system.

2.1 Extensions

The extensions to previous work [4, 5], and thus the original contribution presented in this thesis, will be the automated hierarchy creation, the independent development and implementation of Hierarchical Chamfer Matching (HCM), and the evaluation of HCM as a method for object detection. The development of this algorithm will allow hierarchies other than the initially intended traffic signs to be used, e.g. car models (from outline/badge), hand gestures, alpha-numeric characters and pedestrians.

2.2 Deliverables

Based on these goals, a MATLAB based hierarchy creation implementation, a prototype static matching implementation in MATLAB and a real-time HCM object detection implementation will be delivered.

Chapter 3

Assumptions

Before commencement of this project some of the assumptions were identified. These assumptions must be reasonable for the thesis to be successful. Many of these assumptions are for the specific task of traffic sign detection. The following assumptions exist:

• Camera should be of a high enough quality to resolve signs at speed.
• Lighting must be such that the camera can produce a reasonable image.
• Signs should be positioned consistently in the footage.
• Angle of the signs in relation to the car's position should not be extreme.
• Signs should not be damaged.
• Due to the size invariance of the method the sign should pass through the specific size(s) without being obscured.
• The Computer Vision functions should operate as specified.
• Objects being detected must be similar in shape for the hierarchy to be effective.

Further details of these are in Appendix A.1.

Figure 3.1: Likely Sign Position

Chapter 4

Specification

The goals of this project are:

1. Establish an automated method of hierarchy creation that can be generalised to any database of objects with similar features. This system will initially be based in MATLAB.
2. A prototype matching system for static images, also built in MATLAB.
3. Program an implementation of this object detection in C++ using the Single-Instruction Multiple Data (SIMD) instructions created for the Intel range of processors. This implementation will be the problem of traffic sign recognition.
4. Demonstrate the algorithm on other matching problems.

The following specifies clearly the input and output of each deliverable. A brief specification of what a Smart Vehicle System may do is included.

4.1 MATLAB Hierarchy Creation

The hierarchy creation system should be able to synthesise an image hierarchy without user input into the classification. This system should work on image databases of reasonable size.

Input: A directory of images (which share similarity), and a threshold for the similarity.
Output: A hierarchy of images and combinational templates.

4.2 MATLAB Matching Prototype

This system should match traffic signs/objects on still images accurately. It is not required to meet any time constraints.
Input: An image hierarchy, and an image to be matched.
Output: The image overlayed with matches.

4.3 Real-Time Implementation

This Real-Time Implementation should match objects at over 5 frames per second in reasonable circumstances. It will be written in Visual C++ based upon the DirectShow streaming media architecture, developed using the EZRGB24 example (from the Microsoft DirectShow SDK, Appendix A.2). The image processing operations will be performed by the IPL Image Processing and Open CV libraries.
Input: Image hierarchy and video stream.
Output: Video stream overlayed with matches.

4.4 Smart Vehicle System

A smart vehicle system for driver aid would be a self contained unit, shown in the block diagram (Figure 4.1). This unit would attach to the car either at manufacture or by "retro-fitting".

It would provide the driver with details via either verbal comments or a heads-up display (output block). The system would recognise all common warning and speed signs (real-time detection block). It would be able to keep track of the current speed limit, allowing the driver to check their speed between signs. The system may have higher intelligence, allowing it to tailor the hierarchy or matching chances to the situation; e.g. if the car is in a 100km zone, a 30km speed sign would be unlikely. With the use of radar and other visual clues, the system may be able to control the car. This would avoid possible collisions and keep within the speed limits. The system must be careful not to lure the driver into a false sense of security. People should be wary of the system's ability, particularly in extreme situations such as storms, snow, etc.

Figure 4.1: Real Time Traffic Sign System


Chapter 5

Literature Review

The review of background material for this thesis will cover several topics, all relevant to the project. Firstly, several historically significant papers are reviewed. These papers form the basis of current matching techniques. Secondly, research into state of the art traffic sign and shape based object detection applications is reviewed, justifying the choice of Hierarchical Based Chamfer Matching. Current research into image classification and grouping for search and retrieval is therefore also applicable to this topic. Basic works on trees and graph theory were examined briefly, along with several mathematical texts to understand the concepts.

5.1 Historical Work: Chamfer Matching

Background work on HCMA (Hierarchical Chamfer Matching Algorithm) was started in the late 70's. The first major work on chamfer matching was the 1977 paper "Parametric Correspondence and chamfer matching: Two new techniques for image matching" by H.G. Barrow et al. This work discussed the general concept of chamfer matching: that is, minimising the generalised distance between two sets of edge points. It was initially an algorithm only suitable to fine-matching. The topic was revisited in the late 80's by Gunilla Borgefors [3], an authority on Distance Matching and Transforms. The algorithm was investigated again throughout the mid 90's when implementations on specific hardware became practical. This is well before HCMA systems would have been practical for fast static matching, let alone real-time video.

Borgefors [3] extended this early work to present the idea of using a coarse/fine resolution search. This solved the major problem of the first proposal, its limitation to fine matching. This idea was later presented in [12]. This algorithm used a distance transform proposed by its author, the 3-4 DT. Borgefors also proposed the use of the technique for aerial image registration. The paper demonstrated the algorithm's object detecting effectiveness on images of tools on a plain background. Tools can be recognised based solely on the outline in this situation, hence are perfect for HCM. The conclusions reached were that the results were "good, even surprisingly good, as long as it is used for matching task with in capability (sic)" [3]. Their best results required at least 20 seconds to compute on a binary image of 360x240 pixels. This was on static images in 1993; in 2002, with only hardware improvements and assuming Moore's Law holds, it should be possible in well under half a second.

The mid-nineties saw several uses of distance transforms as matching algorithms. Considerable work was done on Hausdorff matching by Huttenlocher and Rucklidge; [13, 14] showed that the Hausdorff distance could be used as a matching metric between two edge images. Hausdorff matching, as with all distance matching techniques, requires the image to be overlayed over each template to score each match; this is a computationally expensive operation. Thus the HCMA is an excellent tool for edge matching.

5.2 Current Matching Algorithms

Current approaches to shape based real-time object detection include Hierarchical Chamfer Matching, Hausdorff matching, Orientated Pixel Matching and Neural Networks.

5.2.1 Current Hierarchical Distance Matching Applications

Daimler-Chrysler Autonomous Vehicle

The most successful work presented in this area is from Dariu Gavrila and associates at Daimler-Chrysler using HCMA. The systems for both traffic signs and pedestrian detection are based on HCM.

These efficient algorithms are coarse-fine searches and multiple template hierarchies. Their matching technique employs "oriented edge" pixels, labelling each edge pixel with a direction. As suggested in these papers, this matching technique is similar to Hausdorff distance methods. They go on to prove that distance transforms provide a smoother similarity measure than correlations, which "enables the use of various efficient search algorithms to lock onto the correct solution". They have designed algorithms using the SIMD instruction sets provided by Intel for their MMX architecture. Their experiments show that the traffic sign detection could be run at 10-15 Hz and the pedestrian detection at 1-5 Hz on a dual processor 450MHz Pentium system.

The disadvantage of their early approaches was in the one-level tree created. Their method of creating the hierarchy automatically uses a "bottom-up approach and applies a 'K-means'-like algorithm at each level", where K is the desired partition size. The hierarchy creation in this system is not fully automated. The most surprising result of this work is the success of rigid, scale and rotation invariant template based matching for a deformable contour, i.e. pedestrian outlines. Translated, rotated and scaled views are incorporated into a hierarchy. The overall technique proposed by Daimler-Chrysler shows excellent results and is worthy of further development.

Target Recognition

An Automatic Target Recognition system developed by Olson and Huttenlocher [1] uses a hierarchical search tree. Chamfer measures, though not employed in the matching (Hausdorff matching), are used to cluster the edge maps into groups of two. The clustering is achieved by trying various combinations of templates from a random starting point and minimising the maximum distance between the templates in a group and their chosen prototype. The optimisation is done with simulated annealing. A new template for each set of pairs is generated and the clustering continued until all templates belong to a single hierarchy. This creates a binary search tree (Figure 5.1). This approach may be unique. The hierarchy creation presented is a simple approach that may present good results. Worst case measurements of matching templates are then considered to determine minimum thresholds that "assure [the algorithm] ... will not miss a solution" in their hierarchies of resolution/template. There is no mention of the real-time performance of the oriented edge pixel algorithm. It may not be quick enough for traffic sign recognition.

Figure 5.1: Binary Target Hierarchy [1]

Planar Image Mosaicing

Hierarchical Chamfer Matching has been used successfully for planar image mosaicing. Dhanaraks and Covaisaruch [12] used HCMA to "find the best matching position from edge feature (sic) in multi resolution pyramid". They chose one image to be the "distance" image and another to be the "polygon" image. The resolution pyramids are built and matching is carried out by translating the polygon image across the distance image. The interesting concept used in this work was the thresholding for taking a match to the next level. If the score was less than the rejection value, max − (max × percent/100), the pixel was expanded to more positions in the next pyramid level for matching. This is an interesting thresholding concept, based on the maximum values rather than an absolute threshold.

5.2.2 Other Possible Techniques

Hausdorff Matching

Many researchers have considered Hausdorff matching [13, 14, 15] for object detection. It is a similar algorithm to chamfer matching, except the distance measure cannot be pre-processed. It is a valid approach for this application and will be considered as a possible matching strategy.

Neural Networks

Work by Daniel Rogahn [11] and papers such as [16] are examples of neural network techniques for traffic sign recognition. I do not have the necessary background knowledge to explore this properly.

Colour Detection

Colour data has been used for matching in scenarios such as face detection [18, 19]. It is an excellent technique for situations where the colours are constant and illumination can be controlled. Some traffic sign detection algorithms use it as a cue [6, 7, 8, 9]. Previous work by myself on traffic sign recognition has attempted to incorporate colour data into the matching process. Due to most colour representation schemes not being perceptively true, it is difficult to define exact colours for matching. The overhead of detecting the colour (even with a look up table) and the varying illumination made it difficult in this real-time scenario. Signs such as those indicating speed limits have a thin red circle surrounding the details. My previous results have shown that on compressed video this red circle is destroyed by artifacts; it was impossible to determine if this circle was red or brown. Yellow diamond warning signs were quite easily detectable as present in an image, but by including their colour in the detection, many unrelated areas of ground and trees were also highlighted. Their features were not perceptible accurately from colour data alone. Thus identification of signs was not plausible from colour detection alone. It may be more effective in streamed uncompressed video, though it may still be a useful procedure for masking areas of interest.

5.3 Hierarchies and Trees

The results shown by [4, 5] have been far superior to other research [6, 7, 8, 9, 10, 11, 13, 19, 26] into real time object identification. Image classification techniques have been examined in multimedia retrieval systems [20, 21, 22]. Further research was therefore carried out into tree structures and image grouping and classification. Hierarchies and trees have also been investigated [23, 24].

With the increasing electronic availability of large amounts of multimedia material, high speed retrieval systems (such as trees) have been the subject of significant research.

Figure 5.2: Image Clustering with Graph Theory [2]

5.3.1 Graph Theoretic Approach

Selim Askoy [2] used distance measures to obtain similarities between the images. The algorithm they proposed considers retrieving groups of images which not only match the template, but are also similar to each other. They "query the database and get back the best N matches. For each of those N matches we do a query and get back the best N matches again. Define S as the set containing the original query image and the images that are retrieved as the results of the above queries." S will contain N^2 + 1 images in the worst case. A graph is constructed of this set (each template is represented as a node) with the edges representing the distance measures between each template (Figure 5.2). The hierarchy creation was then looked upon as a "graph clustering problem". The clustering algorithm used is presented in the paper. Connected clusters that include the original query image are then found. The measure of inter-cluster similarity is established to determine which cluster should be returned. This has application to object recognition hierarchies. This approach sounds similar to that used in [4, 5] in assuring that each grouping was the closest. The technique demonstrated in [2] was considered a simple and effective starting point for hierarchy creation in this thesis.

5.3.2 Nearest Neighbour

Huang et al [23] used trees established by the nearest neighbour algorithm and built using normalised cuts (partitions of a weighted graph that minimise the dissociation with other groups and maximise the association within the group) in a recursive nature. This creates a tree that can at most have two clusters branching off a parent cluster, yet each leaf cluster could contain more than two images. This allows the simplicity of a binary tree, with the added complexity of many leaves. To represent each cluster, they select a representative image of the cluster rather than compute a new composite image (first proposed in [23]). Their method of construction also allows trees to be created with uneven distances to leaves. This might provide a speed-up in matching where some images in the hierarchy are relatively unique, providing a short and certain path to them. This technique proved effective in the paper, with a potentially useful clustering technique, but was too complicated to pursue in an undergraduate thesis on image matching.

5.3.3 Colour Information

To group images, [20] uses colour information. The N images are placed into distinct clusters using their similarity measure. "Two clusters are picked such that their similarity measure is the largest"; these then form a new cluster, reducing the number of unmerged clusters. The similarity of all the clusters is then computed again. This continues until a bounding parameter (number of clusters, or similarity measure threshold) is reached. Various methods such as linear regression and boolean features can be used for this. After the tree has been created, a cluster centre is established.


Chapter 6

Theory

The theory behind this thesis is split into three main sections. Relevant image processing theories and techniques are explained first. This is followed by the details of the graph theory and hierarchy basics necessary to understand and develop this work. Lastly, some programming libraries that may not be familiar to all electrical engineers are mentioned. Image processing is a relatively dynamic field where many problems are yet to have optimal solutions, but there are still many basic theories and methods that are accepted as the "way" of doing things. Some elements of the HCM algorithm use this type of method, but those relating to the hierarchical search are relatively new to the image processing field.

6.1 Chamfer Matching

The basic idea of Chamfer (or Distance) Matching is to measure the distance between the features of an image and a template. If this distance measure is below or above a certain threshold it signifies a match. The steps required are:

1. Feature Extraction
2. Distance Transform
3. Score the template at "all" locations

4. Determine whether the scores indicate a match

In Gavrila's Hierarchical Chamfer Matching Algorithm (HCMA), distance matching is applied to the scenario of matching multiple objects. When trying to match a set of images with sufficient similarity, a hierarchical approach can be used. Images can be grouped into a tree and represented by prototype templates that combine their similar features. By matching with prototypes first, a significant speed-up can be observed compared to an exhaustive search for each template. The following sections describe the theory behind each of the steps in simple distance matching, before going on to explain the theory of Gavrila's HCMA.

6.2 Feature Extraction

Shape based object recognition starts with feature extraction representations of images. These features are usually corners and edges. Standard edge and corner detection algorithms such as Sobel filtering and Canny edge detection can be applied to colour/gray images to generate binary feature maps.

6.2.1 Edge detection

The goal of edge detection is to produce a "line drawing". Ideally all edges of objects and changes in colour should be represented by a single line. There are algorithms that vary from simple to complex. The generalised form of edge detection is gradient approximation and thresholding. The boundary of an object is generally a change in image intensity. To find changes in intensity we need to examine the difference between adjacent points. Using a first order gradient approximation, changes in intensity will be highlighted and areas of constant intensity will be ignored. Of the edge detectors that use gradient approximation there are two types: those that use first order derivatives and those that use second order derivatives.
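As an illustration of the first order gradient approximation just described, the following MATLAB sketch (not code from the thesis; the greyscale frame I and the 0.5 threshold are assumptions for the example) builds a binary feature map from Sobel derivative estimates:

```matlab
% Minimal sketch of first-order gradient edge detection (illustrative only).
% I is assumed to be a greyscale uint8 image, e.g. one video frame.
img = double(I) / 255;

sobelX = [-1 0 1; -2 0 2; -1 0 1];   % first-order derivative approximation
sobelY = sobelX';

gx = conv2(img, sobelX, 'same');     % intensity change across columns
gy = conv2(img, sobelY, 'same');     % intensity change across rows
gradMag = sqrt(gx.^2 + gy.^2);       % gradient magnitude

edgeMap = gradMag > 0.5;             % arbitrary threshold -> binary feature map
```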

Canny Edge Detector

The Canny edge detector (Canny, 1986) is currently the most popular technique for image processing. It is used in a wide range of applications with successful results. It was formulated with three objectives:

1. Optimal detection with no spurious responses
2. Good localisation with minimal distance between detected and true edge position
3. Single response to eliminate multiple responses to a single edge

The first aim was reached by optimal smoothing; Canny demonstrated that Gaussian filtering was optimal for his criteria. The second aim is for accuracy. The third aim relates to locating single edge points in response to a change in brightness. This requires getting a first derivative normal to the edge, which should be maximum at the peak of the edge data where the gradient of the original image is sharpest. Calculating this normal is usually considered too difficult, and the actual implementation of the edge detection is as shown in Figure 6.1. Non-maximum suppression (peak detection) is used for this. It retains all the maximum pixels in a ridge of data, resulting in a thin line of edge points.

Figure 6.1: Canny Edge Detection Process
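For reference, the whole Canny pipeline (Gaussian smoothing, non-maximum suppression and hysteresis thresholding) is available as a single call in MATLAB's Image Processing Toolbox. A minimal sketch, with illustrative rather than thesis-specific thresholds:

```matlab
% Canny edge detection via the Image Processing Toolbox (illustrative values).
img = double(I) / 255;                     % I: greyscale uint8 frame (assumed)
edgeMap = edge(img, 'canny', [0.1 0.3]);   % [low high] hysteresis thresholds
```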

Non maximal suppression

This essentially locates the highest points in edge magnitude. Given a 3x3 region, a point is considered maximum if its value is greater than those either side of it. The points either side of it on the edge are established with the direction information.

Hysteresis Thresholding

Hysteresis thresholding allows pixels near an edge to be considered as edges for a lower threshold. If there is an adjacent pixel labelled as 1 (edge), a lower threshold must be met to set a pixel to 1. If no adjacent pixels are 1 (edge), a high threshold must be met to set a pixel to 1.

6.3 Distance Transform

Distance transforms are applied to binary feature images, such as those resulting from edge detection. Each pixel is labelled with a number to represent its distance from the nearest feature pixel. The value of the distance transform increases as the distance from a feature pixel in the original image increases. The real Euclidean distance to pixels is too expensive to calculate and for most applications an estimate can be used. Some papers [3, 12, 25] have gone on to prove that approximations were sufficient for the purposes of distance matching. These include 1-2, 3-4 transforms and other more complicated approximations. A 3-4 transform uses the following distance operator:

4 3 4
3 0 3
4 3 4

This matrix shows why the transform is named as such: the diagonals are represented by 4, and adjacent distances by 3. A simple way to calculate a distance transform is to iterate over a feature image using the distance operator to find the minimum distance value for each pixel. Set all feature pixels to zero and all others to "infinity" before the first pass. Then for each pixel, on each pass, set it to the following value (k represents the pass number):

v^k_{i,j} = min( v^{k-1}_{i-1,j-1}+4, v^{k-1}_{i-1,j}+3, v^{k-1}_{i-1,j+1}+4, v^{k-1}_{i,j-1}+3, v^{k-1}_{i,j}, v^{k-1}_{i,j+1}+3, v^{k-1}_{i+1,j-1}+4, v^{k-1}_{i+1,j}+3, v^{k-1}_{i+1,j+1}+4 )

Complete sufficient passes until you have calculated the maximum distance that is necessary for implementation of the matching or other algorithm you intend to use.
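A direct, unoptimised MATLAB sketch of this pass-based 3-4 transform is given below; it assumes edgeMap is the binary feature image from the previous section and simply repeats full passes until no value changes:

```matlab
% Sketch of the iterative 3-4 distance transform described above (not thesis code).
% edgeMap: logical feature image (1 = feature pixel), e.g. from edge detection.
padMap = false(size(edgeMap) + 2);
padMap(2:end-1, 2:end-1) = edgeMap;    % pad so every pixel has 8 neighbours

v = inf(size(padMap));
v(padMap) = 0;                         % feature pixels zero, all others "infinity"

[rows, cols] = size(v);
changed = true;
while changed                          % complete sufficient passes (k)
    vPrev = v;
    for i = 2:rows-1
        for j = 2:cols-1
            v(i,j) = min([vPrev(i-1,j-1)+4, vPrev(i-1,j)+3, vPrev(i-1,j+1)+4, ...
                          vPrev(i,  j-1)+3, vPrev(i,  j),   vPrev(i,  j+1)+3, ...
                          vPrev(i+1,j-1)+4, vPrev(i+1,j)+3, vPrev(i+1,j+1)+4]);
        end
    end
    changed = ~isequal(v, vPrev);
end
dtImage = v(2:end-1, 2:end-1);         % 3-4 chamfer distances (about 3x Euclidean)
```

Borgefors' two pass formulation and library routines compute the same result far more efficiently; this version only mirrors the per-pixel minimisation written above.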

The following images show a feature image and its corresponding distance transform.

Feature Image:
0 0 1 0 1 0
0 0 0 1 0 0
0 0 0 1 0 0
0 0 1 1 0 0
0 0 0 0 0 0
0 0 0 0 0 0

Distance Transform:
6 3 0 3 0 3
8 6 3 0 3 6
7 4 3 0 3 6
6 3 0 0 3 6
7 4 3 3 4 7
8 7 6 6 7 8

Figure 6.2: 3-4 Distance Transform (not divided by 3)

Borgefors is responsible for much of the early work on distance transforms. Her two pass algorithm [25] is popular and is used by the Open CV library. More complicated, faster methods exist.

6.4 Distance Matching

Distance matching of a single template on an image is a simple process after a feature extraction and distance transform. One simply scores each position by overlaying the edge data of the template, as shown in Figure 6.3. The mean distance of the edge pixels to the template is then calculated with

D_chamfer(T, I) = (1/|T|) * Σ_{t ∈ T} d_I(t)   [5]

where T and I are the features of the template and image respectively, |T| is the number of features in T, and d_I(t) is the distance between template feature t and the nearest image feature. This gives a matching score; the lower the score, the better the match. By completing this process for every image position in the region of interest, a score is generated for each location.

Figure 6.3: Overlaying of Edge Image [3]
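The forward scoring loop implied by this formula can be sketched as follows, assuming dtImage and tmplEdge come from the previous steps; the acceptance threshold of 5 is an arbitrary value for illustration, not one used in the thesis:

```matlab
% Sketch of forward chamfer scoring: slide the template's edge pixels over the
% image distance transform and average the distances under them.
% dtImage : distance transform of the image feature map
% tmplEdge: logical edge image of the template (smaller than dtImage)

[tr, tc] = size(tmplEdge);
[ir, ic] = size(dtImage);
nFeat = nnz(tmplEdge);                          % |T|, number of template features

scores = inf(ir - tr + 1, ic - tc + 1);
for y = 1:ir - tr + 1
    for x = 1:ic - tc + 1
        window = dtImage(y:y+tr-1, x:x+tc-1);   % image DT under the template
        scores(y, x) = sum(window(tmplEdge)) / nFeat;   % D_chamfer(T, I)
    end
end

threshold = 5;                                  % arbitrary acceptance threshold
[my, mx] = find(scores < threshold);            % candidate match locations
```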

If any of these scores fall below the matching threshold, the template can be considered found.

There is one problem with the simple "forward" distance matching, where the distance transform of the image is correlated against the feature extraction of the template. If the template is missing features that are present in the image, i.e. if the template points are a subset of the image points, it may score as highly as an exact match. Thus the forward distance matching confirms the presence of template features in the image, but doesn't confirm the presence of image features in the template. The following example (Figures 6.4-6.6) illustrates how an incorrect match could occur due to these circumstances. The template in Figure 6.6 is a sub-set of the image and fits the distance transform.

Figure 6.4: Original Image
Figure 6.5: Distance Image
Figure 6.6: Template

6.4.1 Reverse Matching

A reverse match is often used to preclude these false matches. Figure 6.7 demonstrates the relationship between a forward match (Feature Template to DT Image) and a reverse match (DT Template to Feature Image).

Figure 6.7: Matching Techniques [4]

If we revisit the example that caused errors in forward matching, we can see that its reverse matching score will be significantly lower. When the template of the cross is overlayed on its distance transform (Figure 6.8), the score will be high.

Figure 6.8: Template Distance Transform

The only problem is if there are not sufficient pixels in the edge image. This should be eliminated by forward matching with a "sensible" template. When we combine the forward and reverse match we can use the resulting score to reject or accept matches.
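A sketch of the combined test at a single candidate position is shown below; the variables (imgEdge, imgDT, tmplEdge, tmplDT, y, x) and both thresholds are assumptions for the example rather than values from the thesis implementation:

```matlab
% Sketch of combining forward and reverse matching at one candidate position.
% imgEdge/imgDT  : image feature map and its distance transform
% tmplEdge/tmplDT: template feature map and its distance transform
% (y, x)         : top-left corner of the candidate position (assumed inputs)

[tr, tc] = size(tmplEdge);
imgWinEdge = imgEdge(y:y+tr-1, x:x+tc-1);
imgWinDT   = imgDT(y:y+tr-1, x:x+tc-1);

forwardScore = mean(imgWinDT(tmplEdge));     % template features vs image DT
reverseScore = mean(tmplDT(imgWinEdge));     % image features vs template DT

accept = (forwardScore < 5) && (reverseScore < 5);   % arbitrary thresholds
```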

6.4.2 Oriented Edge Matching

Oriented Edge Matching, and similar techniques, are useful in shape based matching. By splitting the features detected from the extraction into types and matching them separately, the "chance" that you are measuring the distance between the same features of the image and template increases. Gavrila et al [5] used a similar technique to increase the matching accuracy of chamfer matching. Gavrila suggests having M feature types. When using edge points, the orientation can be binned into M segments of the unit circle. Thus each template edge point is assigned to one of the M templates, giving M templates and M feature images. Each pixel now has a distance from the nearest feature pixel and an orientation distance from the nearest feature pixel. The individual distance measures for each of the M types can be combined later.

Huttenlocher [1] and Johnson [26] have published papers describing the use of oriented edge matching in an image hierarchy. This orientation match generalised Hausdorff Matching to oriented pixels. Templates are then matched with this extra parameter. They further clarify that features in the template are present in the image. Their formula for calculating the Hausdorff distance took this extra orientation parameter and normalised it to be comparable with the location distance. It has the same general form as their definition of a Hausdorff measure, and therefore can be substituted into a Hausdorff matching algorithm:

Hα(M, I) = max_{m ∈ M} ( min_{i ∈ I} ( max( |mx − ix|, |my − iy|, α|mo − io| ) ) )   [26, 1]

where mx and my represent the x and y coordinates of template pixel m, mo its orientation, and ix, iy, io the corresponding measures for image pixel i. Oriented Edge Matching can therefore evaluate to a distance measure between orientations as well as positions.
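A simplified sketch of the M-feature-type idea at a single aligned position is given below; gx, gy, tgx, tgy, imgEdge and tmplEdge are assumed to exist already, M = 4 is an arbitrary choice, and the per-type distance transforms use the Image Processing Toolbox bwdist rather than the 3-4 approximation:

```matlab
% Sketch of oriented edge matching with M orientation bins (illustrative only).
M = 4;                                          % number of orientation bins
imgTheta  = mod(atan2(gy,  gx),  pi);           % orientations folded into [0, pi)
tmplTheta = mod(atan2(tgy, tgx), pi);

score = 0;
for m = 1:M
    lo = (m-1)*pi/M;  hi = m*pi/M;
    imgBin  = imgEdge  & imgTheta  >= lo & imgTheta  < hi;   % type-m image features
    tmplBin = tmplEdge & tmplTheta >= lo & tmplTheta < hi;   % type-m template features
    dtBin   = bwdist(imgBin);                   % distance transform of this type only
    if any(tmplBin(:))
        score = score + mean(dtBin(tmplBin));   % distance measured within one type
    end
end
```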

6.4.3 Coarse/Fine Search

A Coarse/Fine Search, or pyramid resolution search, is a popular method for increasing the speed of a search based image recognition technique. Generally a coarse/fine search involves decreasing the "steps" of the template search over the image if matching scores dictate, increasing the scale of the search if the scores are sufficient. Conversely, a pyramid resolution search scales (smaller) the image and template. Though the calculation of the score for each position in a pyramid search requires less computational expense (fewer pixels), the scaling of the template can create difficulties. In a matching scenario such as traffic signs, the details of the signs are quite fine; hence reducing their size can cause these details to be destroyed. In a distance based search, the smooth results (compared to feature to feature matching) mean that a reasonable match at a coarse search level might indicate an "exact" match at a finer level. Then, when using a distance measure, the current threshold, Tσ, can be set such that a match "cannot" be missed. Figure 6.9 shows the furthest the actual location (the cross) can be from the searched positions (squares). If the current resolution of the search is σ, and the threshold defining a match is θ, then to not miss this possibility the threshold must be set according to: Tσ = θ − √2(σ/2). Thus HCM has the excellent property that in a coarse/fine search a "match" cannot be missed.

Figure 6.9: Search Expansion

6.4.4 Hierarchy Search

The approach proposed by Gavrila [4, 5] is to combine a coarse/fine search with a hierarchical search. In this scenario a number of resolution levels is covered concurrently with the levels of the search tree. In this search they use a depth first tree search. At each point the image is searched with prototype template p, at a particular search step; if the score is below a threshold, Tpσ, the search is expanded at that point with the children nodes being scored. To ensure that Tpσ will not reject any possible matches, two factors must now be taken into account: the distance between the location of the score and the furthest possible matching location, and the distance between the prototype template and its children. Thus the threshold for this point of the search is now Tpσ = θ − √2(σ/2) − worstchild, where worstchild = max_{ti ∈ C} D(Tp, Tti), C = {t1, ..., tc} being the set of children of prototype p. Once again, a match cannot be missed.
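The following sketch shows how such a "safe" threshold could drive one level of the search; scoreAt, searchChild, the node structure and the numeric values are all hypothetical placeholders, not the data structures or thresholds used later in this thesis, and bounds checks are omitted:

```matlab
% Sketch of thresholded expansion for one prototype at one grid resolution.
theta = 5;                          % final match threshold (arbitrary)
sigma = 8;                          % grid spacing at this level of the search

Tp = theta - sqrt(2)*(sigma/2) - node.worstChild;   % safe threshold for prototype p

for y = 1:sigma:imgRows             % coarse grid over the image (sizes assumed)
    for x = 1:sigma:imgCols
        if scoreAt(node.template, y, x) < Tp
            % promising: score each child template on a finer grid around (y, x)
            for c = 1:numel(node.children)
                searchChild(node.children{c}, y, x, sigma/2);   % hypothetical recursion
            end
        end
    end
end
```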

6.5 Tree/Hierarchy

The hypothesis of this thesis is to prove that creating a hierarchy of templates will allow the matching process described above to be carried out in real-time on multiple objects. Trees are a specific type of graph fulfilling certain mathematical properties. Graphs come in many different forms and have numerous properties and definitions associated with them. Only the applicable properties will be discussed here.

6.5.1 Graph Theory

"A graph consists of a non-empty set of elements, called vertices, and a list of unordered pairs of these elements, called edges." [27] This statement defines a graph. The vision most people have of graphs is a diagrammatic representation such as Figure 6.10, where points are joined by lines; these are the vertices and edges respectively.

Figure 6.10: Simple Graph [2]

Adjacency: Vertices u and v are said to be adjacent if they are joined by an edge e; u and v are said to be incident with e and correspondingly e is incident with u and v.

The diagrammatic representation is useful for small and simple graphs, but would obviously be confusing for larger representations.

with n vertices labelled 1. they were revisited during work on chemical models in the 1870’s [28].3n. In this definition a directed graph refers to a set of vertices with edges that infer adjacency in only one direction.6. Some tree properties can be used to construct trees from graphs. A dissimilarity matrix is the n x n matrix in which the entry in row i and column j is a measure of the dissimilarity between vertices i and j. Processing graphs in a computer in this form is also generally inappropriate.” [27] A dissimilarity matrix is and adjacency matrix of a weighted directed graph. One type of these are called . Let G be a weighted.11) are connected graphs which contain no cycles. It is possible to take each vertex and list those that are adjacent to it in the column or row of a matrix.2. An adjacency matrix is defined as such: “Let G be a graph without loops.3. Another necessary definition ia a complete graph.5.”[27] 6. called a weight” [27]. TREE/HIERARCHY 29 be confusing for larger representations. Trees were first used in a modern mathematical context by Kirchoff during his work on electrical networks during the 1840’s. The adjacency matrix M(G) is the n x n matrix in which the entry in row i and column j is the number of edges joining the vertices i and j. This thesis will create a tree using traffic sign templates based on their feature similarity. directed graph without loops. Increasingly tree structures are being used to store and organise data. These systems are required to store large amounts of data and search them very quickly.n.2. This form is more suitable to mathematical and computational manipulation. Each edge is weighted with the similarity in that direction. “A complete graph is a graph in which every two distinct vertices are joined by exactly on edge. Multimedia and Internet based storage and search research is at the “cutting edge” of tree development.5. with n vertices labelled 1. The significance of trees has increased in recent years due to modern computers. A weighted graph by definition is “a graph to each edge of which has been assigned a positive number. This tree will then be searched using feature information extracted from an image..2 Trees Trees (Figure 6.

One type of tree used for this purpose is the minimum spanning tree. There are systematic methods for finding spanning trees from graphs, but creating all combinations of the templates and finding spanning graphs would be computationally too expensive, so these are not applicable in this application. An easier approach is to build the tree with a bottom-up approach. A bottom-up approach to growing a tree starts with the leaves, or the lowest level of the tree, which will have only one edge connected to them. From here the tree is constructed by moving up levels, combining the templates at each level. This is necessary because the templates to represent higher levels in the tree are not yet established.

When creating a tree the “programmer/user” needs to determine several variables/concepts before commencing: the features, the size of the desired tree and the tree quality measures.

Features
The features of a tree are usually the attributes used to split the tree; these are the criteria for finding splits. Finding a split amounts to determining attributes that are “useful” and creating a decision rule based on these [29]. Trees can be multivariate or univariate; multivariate trees require combinational features to be evaluated at each node. In a simple tree of integers, the features are obviously the value of the number. Such values are ordered and constant in relation to each other, for instance 9 is greater than 7, and 10 is greater than 9 and therefore also greater than 7. This allows trees to be created easily. Data such as image templates, which are not ordered, are more difficult to place into trees: the fact that image 3 matches image 5 well and image 7 matches image 5 doesn’t imply that image 7 matches image 3 more or less well. Features used to find splits and create an image tree are, in this thesis, likely to be distance measures between images.

Size of Trees
Obtaining trees of the correct size can be a complex issue. Shallow trees can be computationally efficient, but deeper trees can be more accurate (note: that is a very general statement). Some techniques for obtaining correctly sized trees exist. These include restrictions on node size, optimisation of splitting criteria, multi-stage searches and thresholds on impurity [29]. Restrictions on node size allow the “user” to control the maximum size of a node. Thresholds on impurity allow only groups/splits to exist that are above or below a certain value when the splitting criterion is used. A single threshold will not necessarily be possible in most situations, especially considering cases where the sample size can affect the necessary thresholds. Multi-stage searches are perhaps beyond the scope of this thesis.

Tree Quality measures
Tree quality could depend on size, classification of test cases and testing cost [29]. There are many options for deciding the quality of a tree, and this will often be application dependant. A simple method proposed by Gavrila [4, 5] for a distance matching image tree was to minimise the distance between images of the same group and maximise the distance between different groups. This should ensure that the images within a group are similar and the groups are dissimilar, resulting in a more efficient search because fewer paths are tested. The effect will be to decrease the threshold used to determine whether to expand a search.

Simulated Annealing
The optimisation technique used by Gavrila [4, 5] to optimise hierarchies (maximise the tree quality) was simulated annealing. It is a process of stochastic optimisation. The name originates from the process of slowly cooling molecules to form a perfect crystal.

Simulated annealing allows the search to jump out of local minima by allowing “backwards” steps. This works on an exponential decay like temperature, where if the “backwards” change is not too expensive given the current “temperature” it will be accepted. The cooling process and the search algorithm are iterative procedures controlled by a decreasing parameter. Searches with simulated annealing can be stopped based on search length, “temperature” or if no better combination is possible. Simulated Annealing was also used by [10] to recognise objects.

Searching Trees
There are several well-known search methods for trees; two are depth first search (DFS) and breadth first search (BFS, figure 6.12). They differ in their direction of search. A DFS works “down” the tree, checking each path to the leaves before moving across: the next level below must be searched before the search can move horizontally to the next template on the same level. A BFS checks across the tree first, visiting all the vertices adjacent to a node before going onto another one. A good way to visualise a breadth first search is laying the nodes out onto horizontal levels; every node on the current level must be searched before we can move onto the next level. Gavrila [4, 5] used a depth first search, which requires a list of node locations to visit; a breadth first search would not require this list of locations.
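The following short MATLAB sketch contrasts the two searches. The node representation (a struct with fields name and children, the latter a cell array) is an assumption made for illustration only.

```matlab
% Illustrative sketch contrasting depth first and breadth first traversal.
% Each node is assumed to be a struct with fields 'name' and 'children'
% (a cell array of child nodes).
function visited = depthFirst(node)
    % DFS: follow each path down to the leaves before moving across.
    visited = {node.name};
    for k = 1:numel(node.children)
        visited = [visited, depthFirst(node.children{k})];
    end
end

function visited = breadthFirst(root)
    % BFS: visit every node on the current level before moving down,
    % using an explicit queue of nodes still to be expanded.
    queue = {root};
    visited = {};
    while ~isempty(queue)
        node = queue{1};  queue(1) = [];
        visited{end+1} = node.name;
        queue = [queue, node.children];
    end
end
```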

6.6 Programming

This thesis requires a good knowledge of programming concepts and topics. Algorithms and data structures are important, as are concepts of Object Oriented programming. Specific knowledge of MATLAB, the IPL Image Processing and Open CV libraries is also necessary. Details of these are included in Appendix A.


Chapter 7
Hardware Design and Implementation

The hardware for this thesis should be simple and “off the shelf”. No design decisions were required for the hardware; suitable devices were already available in the laboratory. This proves the value of the IPL and Open CV libraries used, showing that these libraries allow image processing on a general purpose platform, providing this platform is of a comparatively good standard.

The two practical media are:

• Video recorded on a digital camera and written to an MPG/AVI.
• Video streaming from a USB/Fire-wire device.

Where the video has been pre-recorded, the only hardware required is the computer. Where it is being streamed, the camera must be plugged into a port/card on the computer. Microsoft DirectShow allows a filter to be built that works on any streaming media source: the Asynchronous File Source and the WDM Streaming Capture Device respectively for the two example sources above. This will allow DirectShow to access the video with a suitable object.

Figure 7.1 shows the block diagram from GraphEdit.

7.1 Camera

This thesis used a standard digital camcorder. Several problems are evident with standard camcorders. The automatic settings do not cater for the high shutter speeds necessary, forcing manual settings, which are difficult to adjust “on the fly”. Due to the camera being pointed down the road, the automatic focus would often blur the traffic sign. In a commercial application a purpose built camera would be used. A purpose built camera would be made to adjust automatically when set at high shutter speeds, and the focus would be fixed to the expected distance of sign detection, or adjusted to focus on the region of interest. It would also be fitted with a telephoto lens to allow high resolution at a fair distance from the sign.

Chapter 8
Software Design and Implementation

All the code for this thesis is included on the CD attached to this document and not as an appendix. Appendix A.12 is simply a listing of directories, files and their contents. The description explains inputs and outputs to most major functions, describes the abstract data types, and shows the procedural design of the functions.

8.1 Hierarchy Creation

The method of hierarchy creation is based on the graph theoretical approach outlined in [2] and the traffic sign specific application in [5]. Briefly, the algorithm involves grouping the images into complete graphs of 2, 3 and 4 vertices. Each complete graph forms a group which can be added to a hierarchy. The hierarchy is constructed taking groups in an arbitrary order based on weightings of group similarities. It is then annealed until further optimisation is not possible. The process tests several orders and optimises hierarchy solutions for each of these. The best hierarchy is chosen by the best optimised scores. Optimisation is defined as minimising intragroup scores and maximising intergroup scores. The technique outlined produces a single level hierarchy; it is a bottom-up approach and can be applied recursively with varying thresholds to generate a multilevel hierarchy.

The features considered in this tree are the image similarities and dissimilarities, which are based on distance matching scores between templates. These help to find splits based on thresholding these values. This seems the logical feature in a hierarchy for distance matching.

The block diagram in figure 8.1 (Hierarchy Creation Block Diagram) represents the process. This was the initial design; refinements were made to the exact methods of each sub-process during construction, resulting in the following implementation. The size of the tree has been limited only by restricting node size, to lessen the complication of the application.

8.1.1 Image Acquisition

The flowchart in figure 8.2 is the design for the process used. The images used for hierarchy creation were taken from websites of sign distributors. This allowed quality pictures of signs to be included. (For other matching applications the images can be generated appropriately.) Before the process commenced, similar sign types were resized, i.e. all the diamond signs were made to be the same size.

Images were then acquired from a directory with a Matlab script. Matlab provides a simple files command to retrieve a list of files from a directory. The list of files is iterated through, checking if the extension is an image (.bmp, .jpg, etc.); if so, it is loaded and its size tested. This is to find the maximum image size in the directory. The list is then iterated again, zero padding any smaller images to the maximum size found in the last iteration and adding them all into a three dimensional vector. The distance transform of each image can then be calculated using chamfer.m.
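A minimal sketch of this acquisition step is shown below. The directory name and the exact extension list are illustrative assumptions, not the thesis script.

```matlab
% Sketch of the image acquisition step (directory and extensions assumed).
d = dir('signs');                        % list files in the directory
names = {d(~[d.isdir]).name};
maxR = 0; maxC = 0; imgs = {};
for k = 1:numel(names)                   % first pass: load and find max size
    [~, ~, ext] = fileparts(names{k});
    if any(strcmpi(ext, {'.bmp', '.jpg'}))
        im = imread(fullfile('signs', names{k}));
        if ndims(im) == 3, im = rgb2gray(im); end
        im = im2double(im);
        imgs{end+1} = im;
        maxR = max(maxR, size(im,1));
        maxC = max(maxC, size(im,2));
    end
end
stack = zeros(maxR, maxC, numel(imgs));  % second pass: zero pad into a 3-D array
for k = 1:numel(imgs)
    [r, c] = size(imgs{k});
    stack(1:r, 1:c, k) = imgs{k};
end
```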

Chamfer.m

The chamfer routine written for template distance transforms was a very inefficient, but simple, calculation of the “chamfer” or distance transform (Figure 8.3: My Chamfer Transform). The time taken for off-line distance transforms is not related to the speed of the matching. Firstly all feature pixels are set to 0, and all non-feature pixels to an effective “infinity”, a maximum value greater than the maximum distance to be iterated to. The algorithm iterates over the image a certain number of times, each time labelling each pixel with the result of:

v^k_{i,j} = min( v^{k−1}_{i−1,j−1}+4, v^{k−1}_{i−1,j}+3, v^{k−1}_{i−1,j+1}+4, v^{k−1}_{i,j−1}+3, v^{k−1}_{i,j}, v^{k−1}_{i,j+1}+3, v^{k−1}_{i+1,j−1}+4, v^{k−1}_{i+1,j}+3, v^{k−1}_{i+1,j+1}+4 )

After this is complete, values are approximated for corner pixels.

Dissimilarity Matrix

The dissimilarity matrix was calculated using the average chamfer distance

Dchamfer(T, I) = (1/|T|) Σ_{t∈T} dI(t)   [5]

where T and I are the features of the template and image respectively, |T| represents the number of features in T, and dI(t) is the distance between the template feature t and the nearest image feature. Entry (i, j) in the dissimilarity matrix represents the distance measure between template i and image j (both being templates from the database).
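The following MATLAB sketch implements the iterative 3-4 chamfer update above; it is an illustration of the described method, not the thesis chamfer.m itself.

```matlab
% Sketch of the simple iterative 3-4 chamfer transform described above.
% 'edges' is a logical edge image; 'iterations' controls how far the
% distances propagate.
function dt = chamfer34(edges, iterations)
    INF = 3 * (size(edges,1) + size(edges,2));   % effective "infinity"
    dt = INF * ones(size(edges));
    dt(edges) = 0;                               % feature pixels start at 0
    for k = 1:iterations
        p = padarray(dt, [1 1], INF);            % pad so the shifts stay in range
        dt = min(cat(3, ...
            p(1:end-2,1:end-2)+4, p(1:end-2,2:end-1)+3, p(1:end-2,3:end)+4, ...
            p(2:end-1,1:end-2)+3, p(2:end-1,2:end-1),   p(2:end-1,3:end)+3, ...
            p(3:end,  1:end-2)+4, p(3:end,  2:end-1)+3, p(3:end,  3:end)+4), ...
            [], 3);
    end
end

% Average chamfer distance of a template against an image: sample the image
% distance transform at the template's feature pixels and take the mean,
% e.g.  avgDist = mean(dtImage(templateEdges));
```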

The initialisation script that performs these steps is called createTree.m.

Inputs:
• (optional) directory

Outputs:
• dissimilarity matrix
• Images (MATLAB structure with fields edgedata, chamdata)

From this point on in the software each image is referred to by its position in the images struct. These positions were allocated by the order in which files were retrieved from MATLAB’s files structure.

8.2 Group Creation

A design diagram is shown in figure 8.4. The graph of images was represented by an adjacency matrix. The adjacency matrix was formed by thresholding the dissimilarity matrix: all values below the maximum distance are set to 1, indicating that the images are similar (adjacent given this threshold). This was effectively setting a threshold on impurity to control the properties of the tree. Groups were created by finding complete graphs within the set of images.

Figure 8.4: Group Creation Block Diagram

Using the properties of adjacency matrices, complete graphs of pairs can be found from the diagonal of the adjacency matrix squared. The pairs found are used to find complete graphs of triplets: the adjacency matrix can be searched to find the product terms contributing to the entries, and the pairs can be used to find the third image. This also requires the adjacency matrix to be cubed; once again the diagonal shows if a triplet is present. Then in the same way triplets are used to find the quads. Effectively all the closed walks of length 2, 3 and 4 through the connected sub-graphs of the set of images have been found. By using the adjacency matrix, instead of the unthresholded dissimilarity matrix, to create these groups we ensure any similarities are of sufficient quality.

8.2.1 Finding groups - setup.m

The MATLAB script for finding the groups, setup.m, is specified as follows:

Input:
• Dissimilarity Matrix
• Images (structure with fields edgedata and chamdata)
• THRES, a threshold on the average chamfer distance

Output:
• Imagestruct (structure with many fields representing groups and their intergroup and intragroup scores)
• Adjsquare (the square of the adjacency matrix)
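A short sketch of the group-finding idea is given below. It is illustrative only and not setup.m itself; the variable names diss and THRES follow the specification above.

```matlab
% Illustrative sketch of finding complete sub-graphs from a thresholded
% dissimilarity matrix (not setup.m itself).
A = double(diss < THRES);        % adjacency matrix (1 = similar enough)
A(logical(eye(size(A)))) = 0;    % no loops

A2 = A^2;                        % diagonal counts closed walks of length 2
[p, q] = find(triu(A, 1));       % each edge (p,q) is a complete graph of two

% Triplets: a closed walk of length 3 through i exists when diag(A^3) > 0;
% the contributing pairs can then be checked explicitly.
A3 = A^3;
triplets = [];
for i = find(diag(A3))'
    for j = find(A(i,:))
        for k = find(A(j,:))
            if k > j && j > i && A(i,k)   % i-j, j-k and i-k all adjacent
                triplets(end+1,:) = [i j k];
            end
        end
    end
end
```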

8.2.2 Score Calculation - createtemps.m

The scores referred to in the previous list are calculated by createtemps.m. The scores of similarity and dissimilarity are calculated using distance matching because this is the method of matching to be used. The createtemps script, as already mentioned, creates a combinational template. The combinational templates are formed by creating a distance transform that is the mean of the images’ distance transforms. The arguments are as follows:

Input:
• Image Number for “root” image.
• Vector of Image Numbers for all in group.
• Images - the structure containing all the image data.

Output:
• temps - the structure containing the template data (distance and edge) and scores
• Tempscore - the template intergroup score

Intragroup scores are found by testing all the templates in a group against their combinational template distance transform. The worst (maximum) score of the templates against the combinational template is the intragroup score; thus each group is scored by how badly it matches its template. Intergroup scores are calculated by comparing a combinational template to all the images not included in that template. The minimum score is taken to be the intergroup score because it represents the best match: the goal is to reduce the intergroup similarity, hence make the best match (minimum) as bad as possible.

The combinational template is later thresholded to create the feature template for this possible tree node. It is morphologically thinned to ensure that all lines are of single width. This process reveals the common features of the template. It should ensure that combinational templates are subsets (or close to subsets) of the templates; thus, as explained when discussing “reverse” matching, they match as well as the actual templates. The templates are not saved at this time, simply the scores; if all templates for each group were saved the memory necessary would start to become ridiculously large. The scores are stored, and the templates are recreated later.

8.2.3 Hierarchy Creation

Hierarchy Creation is performed by the MATLAB script createhier.m, a recursive implementation (see figure A.2). The flowchart (Figure 8.5) shows the procedural design of the script. The features being used to create and optimise the hierarchy are the group size, intergroup and intragroup scores. An arbitrary order is used to guide the initial selection of the groups. Firstly, the script iterates through the order to find the highest scoring match that will fit into the hierarchy for each image not already included (findbestnotin.m). Once this has been completed, all images with no possible groups, or those that haven’t already been included, are added as single images. Once the hierarchy is finished the group scores are added together to form a hierarchy score. The hierarchy has two scores, one to represent the intragroup scores and one for the intergroup scores. These scores are a measure of hierarchy quality.

Inputs:
• allpairs (vector containing all pairs already in hierarchy)
• Imagestruct (same as before)
• Order (arbitrary order of construction)
• Hierarchy (the existing hierarchy, if partially built)
• NOIMAGES (number of images)

Outputs:
• hierarchy (structure with groups, scores and intergroup scores)
• Scorevect (the cumulative totals for inter and intra group scores)
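The combinational template and the two group scores can be sketched as follows. The variable names (dts for the 3-D array of distance transforms, edgeImgs for the edge images, group for the member indices) are assumptions for illustration, not the createtemps.m code.

```matlab
% Sketch of the combinational template and group scores described above.
group = group(:)';                         % ensure a row vector of indices
combDT = mean(dts(:,:,group), 3);          % combinational distance transform

% intragroup score: worst (maximum) average chamfer distance of the
% group members against the combinational template
intra = 0;
for i = group
    edges = edgeImgs(:,:,i) > 0;           % member's feature pixels
    intra = max(intra, mean(combDT(edges)));
end

% intergroup score: best (minimum) match of any image outside the group
others = setdiff(1:size(dts,3), group);
inter = inf;
for i = others
    edges = edgeImgs(:,:,i) > 0;
    inter = min(inter, mean(combDT(edges)));
end
```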

Figure 8.5: Hierarchy Creation Flowchart

8.2.4 Hierarchy Optimisation

The hierarchies are optimised with a simple method similar to the simulated annealing used by Gavrila [4, 5]. In this case it has been greatly simplified due to limited understanding of the mathematical concepts and time restraints. The quality of the trees is measured by the similarity of images within a group (intragroup scores) and their dissimilarity to other groups (intergroup scores). “Backwards” steps are not dependent on a “temperature” factor. To avoid local minima the optimisation is allowed to take one step backwards, to a higher score; if the next score is then lower than the previous best, the change is accepted, otherwise the annealing process is finished. If the resulting hierarchy has a better score, it is kept and the annealing process is continued. A group is not allowed to be removed if it has no pairs, if it has already been removed, or if the last step was backwards.
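The accept/one-backwards-step control logic can be sketched as below. The representation (a vector of group indices with stored intra and inter score vectors, and dropping a random group as the perturbation) is an assumption made purely to illustrate the control flow, not the thesis anneal function.

```matlab
% Sketch of the simplified annealing-style loop (control logic only).
% intraScore/interScore are assumed vectors of stored group scores.
score = @(idx) sum(intraScore(idx)) - sum(interScore(idx));  % lower is better
current = 1:numel(intraScore);        % groups currently kept in the hierarchy
best = current;  bestScore = score(best);
wentBack = false;
while numel(current) > 1
    k = randi(numel(current));        % perturb: drop one group (illustrative)
    trial = current([1:k-1, k+1:end]);
    s = score(trial);
    if s < bestScore                  % better hierarchy: keep it
        best = trial;  bestScore = s;
        current = trial;  wentBack = false;
    elseif ~wentBack                  % allow a single backwards step
        current = trial;  wentBack = true;
    else
        break;                        % a second worse step ends the annealing
    end
end
```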

This optimisation takes place in the combinegroups.m script (Figure 8.6: combinegroups.m Flowchart). The anneal function calls the remove function each pass. The optimisation is attempted for a variety of arbitrary orders, to show the effectiveness of the optimisation. The hierarchy is created for each order using createhier.m; it is then optimised with the anneal function. This is all represented by the flowcharts in figures 8.6, A.4 and A.5.

8.2.5 Multi-Level Hierarchy

To create a multi-level hierarchy the same functions are applied to the templates resulting from the combination of the leaf level images. A script, temps2images.m, was programmed to automate the creation of multilevel hierarchies. For the ”best” hierarchy the templates are regenerated, as only template scores were saved the first time they were created. Alternatively, if all the template images are written to files they can be accessed with createTree to form another hierarchy.

8.2.6 Final Implementation

The final implementation has been submitted with this thesis. To use this bottom-up hierarchy creation, you must recursively apply it to each level of the hierarchy:

1. Place the image files into the same directory.
2. Run the createtree script (edited to use that directory) to get the images and dissimilarity matrix into the workspace.
3. Use setup1 to create the groups that are used to optimise the hierarchy.
4. Run combinegroups to combine and optimise these groups into a hierarchy.
5. Repeat this on the combinational templates for the next hierarchy level, and so on.

Combinegroups will show you each group, and output as files the images of the combinational templates. They will be named based on the number of the “root” of each group. Note: the algorithm is written recursively to make it easy (not efficient), which means it will fail if there are too many groups or too many images, which happens if your threshold is too low.

8.3 MATLAB Prototype Matching

A prototype matching system was created in MATLAB to help understand and refine the algorithm. This helped to refine several techniques and test possible approaches to speeding up the matching process, in an easy development environment. It was always destined to be slow and unusable, even on static images, but was an excellent learning experience. The first approach taken was a simple distance match of one template to images. This template could then be selected to test varying combinations of template and image. The basic chamfer matching algorithm was implemented using simple forward and reverse matching. This led to the notion of masking the reverse search to avoid non-sign details affecting the match. The final MATLAB matching system, involving both forward and reverse matching, was different than the eventual Visual C++ real-time system.

Effective matching was stifled greatly by trees. Further methods tried to improve the matching included localised thresholding, sub-sampling of the edge detection and oriented edge detection. Simple colour detection, and subsequent masking of the edge detection image by the colour information, was tried; however, the thesis was not meant to use colour information, due to previous work proving it to be unreliable, so this approach was discontinued. Localised thresholding (A.5.1) used simple statistical methods, first encountered during ELEC4600, to increase the threshold in areas with high edge content. This was an attempt to reduce detail in areas of trees, but they still caused unnecessary expansion of the search. Different thresholds (A.5.2) were also tried for different levels of the search. Sub-sampling (A.5.3) of the edge detection also attempted to remove the tree data, but at a fine level helped reduce false matching. Oriented edge detection did not remove the trees as possible matches but increased their “random” appearance when compared to the well directed outline of traffic signs.

8.3.1 Basic System

The design of the basic fine/coarse single template distance matching system in MATLAB is as follows (Figure 8.7: Simple Matching System):

R10simplepyroverlay.m

This file implements the design in figure 8.7. After the initialisation of variables, the iterative for loop steps through each of the starting positions, separated by 8 pixels vertically and horizontally, which are iterated over. For each search position, based on the step, a forward score is calculated. If this forward score is below a threshold, the search is expanded further on this location. This location is passed to a recursive function, named expand, which searches this sub-area by recurring with a smaller step. As shown, it takes a recursive approach to searching each of the starting locations. Following the theory relating to coarse/fine distance matching, the threshold is reduced as a function of the step. If the step is one, a reverse and forward score is computed for both locations. If there were sufficient pixels in the edge image to indicate a “sensible” reverse match, and the product of the forward and reverse scores met the threshold, a match is considered to be found. If these conditions are not met, the search is terminated without a match being found. This matched individual templates well, but the search still expanded unnecessarily on areas of noise, like trees, and gave some false matches.

8.3.2 Masking Reverse Search

This simple search was improved by masking the reverse search. The reverse search is only appropriate for areas included within the boundaries of a sign, not the background. For example, if a sign is surrounded by trees the edge detection may look as in figure 8.8 (Noise Behind Sign). Setting the region of interest to the square shape of the matrix will cause the tree edge detection to inflate the reverse score. By masking the edge detection with figure 8.9 (Reverse Matching Mask), the region of interest is changed to include solely the inside of the boundaries of the sign, ensuring the only features considered are those of the sign.
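A self-contained sketch of this recursive coarse/fine search with a masked reverse match is given below. It is illustrative only, not R10simplepyroverlay.m: the threshold relaxation and the combined forward/reverse test are assumptions chosen for the sketch.

```matlab
% Sketch of the recursive coarse/fine single-template search with a masked
% reverse match (illustrative thresholds).
% imgDT   - distance transform of the image edge map
% imgEdge - logical edge map of the image (for the reverse match)
% tEdge   - logical edge map of the template, tDT - its distance transform
% mask    - logical mask of the sign interior (reverse search mask)
function matches = expandSearch(r, c, step, imgDT, imgEdge, tEdge, tDT, mask, theta)
    matches = [];
    [h, w] = size(tEdge);
    if r < 1 || c < 1 || r+h-1 > size(imgDT,1) || c+w-1 > size(imgDT,2), return; end
    sub = imgDT(r:r+h-1, c:c+w-1);
    fwd = mean(sub(tEdge));                    % forward score: template onto image DT
    if fwd > theta + 2*step, return; end       % relaxed threshold at coarse steps
    if step <= 1
        roi = imgEdge(r:r+h-1, c:c+w-1) & mask;
        if nnz(roi) > 0.5*nnz(tEdge)           % enough pixels for a sensible reverse match
            rev = mean(tDT(roi));              % reverse score: image edges onto template DT
            if fwd * rev < theta^2
                matches = [r, c, fwd, rev];
            end
        end
    else
        for dr = 0:step/2:step                 % refine this location with half the step
            for dc = 0:step/2:step
                matches = [matches; expandSearch(r+dr, c+dc, step/2, ...
                           imgDT, imgEdge, tEdge, tDT, mask, theta)];
            end
        end
    end
end
```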

8.3.3 Pyramid Search

This pyramid search used the hierarchy object created by the MATLAB script described in the previous section. This implementation is much more complicated than the simple search, as the hierarchy must be searched concurrently with the coarse/fine matching. The following design (Figure 8.10) was used to search each group for a match:

Figure 8.10: Pyramid Search

It was implemented in pyroverlay.m and was a simple iteration through each member of the group, finding the maximums for each. Another function simply called each group. Work on the pyramid search was very brief due to the poor results of the simple one-template fine/coarse search. Refinements were needed to the matching design to improve accuracy and precision.

Oriented Edge Detection

Oriented edge detection has been used by other researchers in matching problems, including hierarchical searches, so the expectations were high. The planned implementation was to modify the existing canny edge detection algorithm in MATLAB to produce a binned orientation map. The canny edge detector already estimates the direction of edges for use in the non-maximal suppression. By binning the values during this calculation, the output from the canny edge detector could be scaled with different magnitudes representing orientations. Directions are binned based on the diagram in the MATLAB image processing toolbox edge function, which marks the pixel in question and divides each quadrant of the gradient vector into two cases by the 45 degree line: in one case the gradient vector is more horizontal, and in the other it is more vertical. There are eight divisions, but for the non-maximum suppression only four are needed, since symmetric points about the centre pixel are used. The edge function iterates over the directions, so a matrix can be output with edges directionally coded into the magnitude (Figure 8.11). The edge pixels for each direction are placed in another, three dimensional, matrix, directionmatrix. Shown in A.1 is the section of code changed.

Figure 8.11: Oriented Edge Detection

8.3.4 Directional Matching

To match the oriented edge detection to the template requires an “oriented edge map” of the template. Extending the distance transform to produce a matrix labelling every position with the direction of the closest template feature pixel allows a comparison of the distance between image and template pixels and a “distance” between their directions. Directionchamfer.m was the script written to perform this function. The code in A.4.2 was iterated over the template image: after calculating the result of the minimum distance (split into positions which have 4 added to them and those which have 3), a position then inherits the direction of the pixel that its minimum distance was calculated from, equating to the following:

dir_{i,j} = dir( min( v^{k−1}_{i−1,j−1}+4, v^{k−1}_{i−1,j}+3, v^{k−1}_{i−1,j+1}+4, v^{k−1}_{i,j−1}+3, v^{k−1}_{i,j}, v^{k−1}_{i,j+1}+3, v^{k−1}_{i+1,j−1}+4, v^{k−1}_{i+1,j}+3, v^{k−1}_{i+1,j+1}+4 ) )

Results of this for the image shown in figure 8.11 are shown in figure 8.12. This was too expensive to perform on the entire image, so it was only implemented for forward matching against the distance transform of the template. A more efficient distance transform method may have been possible, but as this was a prototype implementation designed to test the algorithm, it was not attempted.

The oriented distance matching was implemented for single image coarse/fine matching. The script written was simplepyrdirectedoverly.m. Positions were expanded based on the forward matching as before, but confirmation of matches used the forward, reverse and orientation matching scores. This rejected almost all of the false matches, allowing the results presented in my thesis seminar.
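A sketch of a distance transform that also propagates the direction label of the nearest feature pixel is given below. It is illustrative only, not Directionchamfer.m; the explicit pixel loops are kept for clarity rather than speed.

```matlab
% Sketch of a 3-4 chamfer transform that also inherits the direction label
% of the nearest feature pixel. 'edges' is logical; 'dirs' holds the binned
% edge directions (1..4) at the feature pixels.
function [dt, dirmap] = chamfer34dir(edges, dirs, iterations)
    INF = 3 * sum(size(edges));
    dt = INF * ones(size(edges));  dt(edges) = 0;
    dirmap = zeros(size(edges));   dirmap(edges) = dirs(edges);
    offs = [-1 -1 4; -1 0 3; -1 1 4; 0 -1 3; 0 0 0; 0 1 3; 1 -1 4; 1 0 3; 1 1 4];
    [R, C] = size(edges);
    for k = 1:iterations
        newdt = dt;  newdir = dirmap;
        for r = 1:R
            for c = 1:C
                for n = 1:size(offs,1)
                    rr = r + offs(n,1);  cc = c + offs(n,2);
                    if rr >= 1 && rr <= R && cc >= 1 && cc <= C
                        cand = dt(rr,cc) + offs(n,3);
                        if cand < newdt(r,c)
                            newdt(r,c) = cand;           % minimum distance
                            newdir(r,c) = dirmap(rr,cc); % inherit that pixel's direction
                        end
                    end
                end
            end
        end
        dt = newdt;  dirmap = newdir;
    end
end
```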

Figure 8.12: Orientation Map

The orientation matching score was calculated using the following formula: for every pixel in the directed edge image, subtract the value of the corresponding pixel in the direction map, then Mod that with three. This results in the following scores:

Edge Score   Orientation Map   Matching Score
    1              1                 0
    1              2                 1
    1              3                 2
    1              4                 1
    2              1                 1
    2              2                 0
    2              3                 1
    2              4                 2
  etc...

Table 8.1: Directional Scoring
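The scores in Table 8.1 correspond to the circular difference between the four direction bins. The snippet below is one reading of that scoring rule, given as a hedged illustration rather than the thesis code; edgeDirs and dirMap are assumed to hold the binned directions (1..4) of the image edges and of the template direction map respectively.

```matlab
% Orientation score per pixel: circular distance between the two bins,
% reproducing the values in Table 8.1.
nbins = 4;
d = mod(edgeDirs - dirMap, nbins);                   % 0..3
orientScore = min(d, nbins - d);                      % 0, 1 or 2
totalOrientation = sum(orientScore(edgeDirs > 0));    % accumulate over edge pixels
```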

8.3.5 Rejected Refinements

Some rejected refinements to the system are presented in Appendix A.

8.3.6 Final Implementation

A final implementation for the MATLAB prototype matching was not delivered. The work was always intended to aid in understanding of the algorithm. The work was left “unfinished” and implementation of the real-time system was started.

8.4 Real-Time

The design for the real-time implementation would reflect the properties of the matching algorithm discovered in the prototype implementation.

8.4.1 Matching Process

The procedural diagram of the matching algorithm to be implemented is shown in figure 8.10, simplified to the main processes. The basic forward and reverse matching design would be implemented first; if additional accuracy proved necessary, this could be expanded to include orientation matching. The matching is executed on each frame. Each template root is forward scored against positions. Based on this score the position is expanded to include sub-positions. If the matching reaches the minimum step, the hierarchy search is enacted. This will search the children of each root, expanding on the best match above the threshold. If the leaf level of the tree is reached, a reverse match is calculated, confirming the presence of the sign. Reverse matching cannot be used until the leaf level of the matching process because combinational templates may not include every feature of the leaf template they represent; they only contain the common features of their leaves.

8.4.2 Object Oriented Design

The system was designed in an object oriented environment, hence Object Oriented design concepts were used. The following initial design (Figure 8.13) was established prior to implementation, based on the prototyping. These intended designs are the ideal situation, where the search is handled within the template classes. The class diagram (Figure 8.13) shows the main classes and their main features important to the matching algorithm. An abstract builder pattern is used to create the trees. The EZrgb24 class contains the image data: this includes the edge image, distance image and the output image. The transform method is part of the original example code, and is where the edge detection and distance transform will be effected. EZrgb24 will also write the output to the stream.

Figure 8.13: Intended Class Diagram

The EZrgb24 object is responsible for creating the IplImage objects to represent the images. The mytemplatev class contains and operates on the template data. Each template contains its distance and edge data and a pointer to its array of children. It has methods to score and search through the image hierarchy, and there are methods for EZrgb24 to access the data. The mytree class is an abstract builder; it has subclasses that are concrete builders. The builders are invoked to create the tree of templates. Any of the concrete builders can be implemented at run-time, and can use the same interface. The mytree class allows polymorphism to be used, and allows other classes access to the hierarchy through the array of root templates. The constructor creates the mytemplatev objects in the appropriate hierarchy. This design is simplified due to my limited knowledge of UML.

Sequence Diagram

The sequence diagram (figure A.7) shows the flow of control between processes and objects. The EZrgb24 object also creates a mytree object and instantiates it with whichever tree is necessary for the matching task.

For each frame the transform method is executed, and transform runs the hierarchy search. For each position to be searched, the template class then takes care of forward and reverse scoring appropriately through the hierarchy (not shown in detail), until a match is found. Scores and details of the found image are returned to the EZrgb24 object to be written to the output. Obviously, at the completion of each frame the transform filter will be run again. This design allows the template details to be completely hidden from the EZrgb24 object by encapsulation within the mytemplatev object. As can be seen, this design does not merge the coarse/fine search into the hierarchy search; separating the searches simplified the programming task. If this were included, as indicated in the theory, the threshold must be constantly modified by step and template-to-combination-template parameters. If the results were poor, it could have been added later.

8.4.3 Actual Design

By reverse engineering the code actually written I am able to present the actual design implemented. This change in design was necessary during the programming of the thesis. The class diagram A.8 of the actual implementation shows that the EZrgb24 class has responsibility for most of the scoring and searching methods. It was much simpler to keep all references and use of this data within the transform method and not “pass” it to the template objects, even as a reference to a static attribute. Reverse scoring is still run in this method. The classes have also become too big; ideally in object oriented design the analysis phase should ensure that the classes are minimal and do not implement too much functionality. The mytemplatev class has almost become an abstract data type. The basics of tree building are kept the same in this design: the builder class creates the template hierarchy, and (not shown in the diagram) allocates the images with the calculated edge and distance transforms. The IplImage objects require careful maintenance to prevent memory leaks, but memory deallocation cannot be properly controlled without causing errors. Due to the limited nature of reverse scoring, this memory leak is allowed to continue, to demonstrate the intended design.

The diagram also shows more of the implementation details, i.e. private variables, but without showing all the private methods needed. It can be easily seen it is overcomplicated, and control resides mainly within the transform filter.

Sequence Diagram

The sequence diagram A.9 reflects these changes made to the class diagram. Firstly, a coarse/fine search is executed with the root templates at each position; this forward searches the root templates until the step is one. This may be expanded to the hierarchical search based on the scores. At the leaf nodes a reverse search is executed. The output is written similarly to before. The function is greatly simplified because the transform method knows which template matches (because it generated the scores).

8.4.4 Further Information

Appendix A.7 explains in more detail how some difficult parts of the implementation were achieved.

8.4.5 Enhancements/Refinements

The following enhancements and refinements were implemented on the real-time system:

• Spiralling out from the centre of the ROI
• Temporal Filtering to remove trees
• Oriented Edges
• Expanding all possibilities or best
• Reverse Scoring
• Truncating the Distance Transform

These are explained in detail in Appendix A.8.

8.4.6 Final Matching Algorithm Used

The final matching algorithm implemented used the maximum, truncated and scaled distance matching approaches. Only the best match at each level of the hierarchy was expanded, due to false matches. The spiral search pattern was rejected as the matching may need to find multiple objects. Temporal filtering was also unnecessary. Oriented edge detection was not fully implemented in the real-time environment, as match accuracy was reasonable.

8.4.7 Further Examples

Some further examples were programmed to prove the possibilities of the matching algorithm:

• Letter Matching
• Size Variance Matching
• Deformable Contour Matching

Letter Matching

Many of the improvements mentioned were discovered by using a simplified matching case, that of letters. Traffic sign footage contains many uncontrollable variables, such as lighting, trees, car movements, damage and occlusions. The hierarchy works on text of a known font. By creating a hierarchy of letters, creating the templates using bitmaps of the letters, and providing the letter images as printed out copies in a very large font, the matching can be demonstrated and tested in real-time in the lab. The hierarchy is interchangeable with the traffic sign hierarchy thanks to polymorphism. This system allowed the results of many of the refinements to be tested, and some of the refinements/bugs were discovered by preparing this easier case.

Size Variance Matching

Hierarchical chamfer matching can be employed to create a size variant matching system. By creating a hierarchy of differently sized objects they can all be searched for simultaneously.
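The leaf templates for such a size variant hierarchy can be produced by scaling a single base shape. The sketch below is an illustration under that assumption (a circle outline, twenty scales, file names invented for the example); the resulting edge images would then be fed to the normal hierarchy creation scripts.

```matlab
% Illustrative generation of differently sized circle templates for a
% size variant hierarchy (file names and scales are assumptions).
base = false(101);                       % base circle outline
[x, y] = meshgrid(-50:50);
base(abs(sqrt(x.^2 + y.^2) - 45) < 1) = true;

scales = linspace(0.3, 1.0, 20);         % 20 different sizes
for k = 1:numel(scales)
    t = imresize(base, scales(k), 'nearest');
    t = bwmorph(t, 'thin', Inf);         % keep a single pixel wide outline
    imwrite(t, sprintf('circle_%02d.bmp', k));
end
```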

The hierarchy is generally formed by grouping similarly sized objects (Figure 8.14: My Size Variant Hierarchy).

Rotational Matching

Another possible scenario, as already mentioned, is rotations of objects. Most rotations bear a similarity to the previous and next rotation. By exploiting this, a hierarchy can be created of similar rotational shapes. In this application, where masking of the reverse search is used, this would also require a different mask for each template, instead of different masks for subtrees (circle, diamond, etc.). Due to this added complication, the example application realised used simple circles on a plain background, for example (Figure A.).


Chapter 9

Results
9.1 Hierarchy Creation

The hierarchy creation code could successfully optimise small hierarchies. When given a relatively low threshold, the results were not affected by the order of the images and were therefore optimised. If the threshold was high, the results were the same for groups of similar orders. The creation system was designed around recursive programming. This allowed problems to be simplified; however, it did not create an efficient system. Due to memory constraints, large hierarchies and those with many combinations (i.e. low threshold) should be avoided.

9.1.1

Hierarchies

Selected results generated by the hierarchy creation are included.

Diamond Signs

The hierarchy included in A.9.1 is of a sub-set of diamond sign templates to demonstrate the effectiveness of the automated hierarchy creation. The following commands were used:


[diss, images]= createTree;

[imagestruct, adjsquare]= setup1(diss, images, 0.5);

[hierarchy, temps, options] = combinegroups(images, imagestruct, 0.5, adjsquare);

The scores achieved by each order before optimisation are presented in figure A.14. After optimisation it can be seen (figure A.15) that groups with similar starting orders have given the same score, hence some “optimisation” has been achieved.

Circular Signs

A hierarchy of circular signs was generated in a similar manner. The following figure (Figure 9.1: Circular Sign Hierarchy) represents the hierarchy created. The scores achieved are represented in figure A.36.

They demonstrate that an “optimal” solution has probably been achieved, because no matter what order the hierarchy was created in, the result was the same.

Others

Hierarchies for letters, multi-resolution and deformable contours are not included due to size restrictions on this document.

9.2 Matlab Matching

The basic system of single template matching, which was able to match signs in static images, has limited possibilities. The development of the algorithm provided some insights into valuable enhancements. The addition of reverse matching had limited success: it produced similar scores for an exact match as it did for noise-like patterns created by the edge detection of trees. The first major improvement to matching accuracy was made by the masking of reverse scoring (8.3.2). This refined the sign matching but still allowed unnecessary search expansion. Using additional oriented edge information increased the accuracy and precision of the match, due to the exploitation of the canny edge detection; so did matching in different feature extractions, and there was only a slight increase in computational expense. Localised thresholding, as a means to limit unnecessary expansion of the search, proved computationally expensive (Figure A.6). This still did not limit the search’s tendency to expand in places that contained dense edge information, such as trees, and using a hierarchical search in this situation did not improve the matching. The ideas rejected include the spiral search pattern, simple temporal filtering, and expanding “all” matches below the threshold. The use of truncated distances was retained. Oriented edges were not implemented, but results indicate that this or another stage may still be necessary for match verification.

9.2.1 Matlab Matching Results

The diagrams in A.10 detail the results achieved in the MATLAB matching prototype.

9.3 Real-Time Matching

The results presented on traffic sign detection show that a real-time detection system based on Hierarchical Chamfer Matching, built for a general purpose platform, is a realistic goal. The matching algorithm is intolerant to poor edge detection. In much of the footage recorded the sign is blurred as it nears the correct size for matching, probably due to automatic focussing of the camera.

9.3.1 Performance

On a 1.6Ghz Pentium 4 with 256 Meg of RAM, the frame rates varied from over 20 frames/second in scenes where there was little noise to cause unnecessary expansion, to under 10 frames a second for more difficult scenes. These results were on video that was 360 × 288 pixels.

9.3.2 Results

Virtually all traffic signs that were edge detected without distortion were found. False matches were infrequent and limited to noisy sections. The biggest problem affecting matching was the poor edge detection resulting from blurred footage. This blurring reduces the quality of the edge detection, which means features are missing from the image and a good forward match is not possible. In other examples, such as letter matching, the results are much better due to the controlled environment. The images here (figures 9.2 and 9.3) show the system output.

Figure 9.2: 50 Sign

Figure 9.3: 60 Sign

9.3.3 Letter Matching

A hierarchy was created using the alphabet in a known font. Results proved that print-outs of letters could be matched when held in front of a USB camera, at a high frame rate, with virtually no false matches in an “office” environment. This further demonstrated hierarchical matching.

9.3.4 Size Variant Matching

The demonstration of matching over 20 different sized circles, once again in real-time, has shown that this algorithm could be suitable for matching an object of unknown size. By creating a size hierarchy, then using an object type hierarchy, a very large number of shapes and sizes could be recognised.

9.3.5 Rotational Matching

A simple cross pattern was sampled at varying rotations and placed into a hierarchy. This demonstrated the algorithm’s ability to match rotations of objects. By combining rotations with scaling and skews in a large hierarchy, a very robust detection system would be possible.

9.4 My Performance

9.4.1 Skills Learnt

During the course of this thesis I have learnt many new skills. These included graph theory, image processing and object oriented programming. I learnt several graph theory concepts and the basics of constructing a tree. I greatly expanded my mathematical knowledge of image processing, particularly edge detection, distance transforms and matching metrics. My Visual C++ programming skills were improved due to the complicated nature of the matching algorithm. I also benefitted from investigating the OpenCV library in great detail.

9.4.2 Strengths/Weaknesses

My main weakness was object oriented design and the ability to realise that design; I was unable to build a well structured program. My strength is knowledge, experience and understanding of image processing algorithms. This allows me to quickly evaluate possible approaches based on results of prototyping. With the knowledge gained during the thesis I could perform much better if the application were to be programmed again.

Chapter 10
Future Development

Several aspects of this thesis could be improved if future work was conducted:

• More consistent video footage
• Temporal information included
• Better Object Oriented Design
• Improved hierarchy generation
• Optimisation
• Final verification stage

10.1 Video Footage

The fixation of a camera to the vehicle and the use of a more suitable camera would increase the consistency of footage. A clearer image would allow better edge detection and therefore better matching.

10.2 Temporal Information

The use of temporal information in the application would increase the quality and speed of matching. Tracking of potential signs from frame to frame and use of the expected paths of traffic signs could help to speed up matching by pin-pointing likely locations.

10.3 Better OO Design

If the design were more complete and could be effectively realised, the quality of code may be increased. This would not necessarily make it faster, but would increase the readability.

10.4 Improved Hierarchy Generation

If the MATLAB(tm) system was able to handle larger image databases, and automatically generate the concrete builder in C++/pseudocode, the application would be easier for developers to use. Larger hierarchies could also be created, and changes in hierarchy would be simpler.

10.5 Optimisation

After the design was improved and the code simplified, effort could be spent improving the efficiency. Design methods for speed of applications in this environment should be studied further. Further improvements would be possible if unnecessary expansion caused by noise, such as trees, could be stopped. A faster application would allow larger hierarchies and hence more robust matching.

10.6 Final Verification Stage

A final verification stage, such as using orientation information (similar to the MATLAB prototype), colour, or a neural network stage [4, 5], could be used to eliminate the remaining false matches.

Chapter 11
Conclusions

This thesis proved that the hierarchical distance matching algorithm is effective for many image processing scenarios, in particular traffic sign recognition. The goals of the thesis were achieved. Relevant literature has been reviewed to provide a theoretical basis for this thesis. This document contains the assumptions and a brief specification of the recognition system. Implementation and design details of both hardware and software are included, as are results, recommendations for future work and conclusions.

A hierarchy creation system was implemented in MATLAB. It uses graph theory concepts derived from [2]. The hierarchy creation system that has been developed will create and optimise a structure, and the output of this can be used in both the static and real-time matching systems.

A simple static matching system was developed to prototype the algorithm outlined in [4] and [5]. The matching was then prototyped, again in MATLAB. This allowed various parameters and properties of the metric to be explored. It is incomplete, but matches single templates on images with high accuracy yet poor time performance.

A real-time matching application was built utilising the IPL Image Processing and OpenCV libraries. It is capable of recognition up to 20 frames per second, and very few false matches are detected. It was also expanded to other matching scenarios. The real-time system is dependant on quality video footage, but can produce excellent results.

The traffic sign matching application developed has proven that “smart” vehicle systems are not far away from mass production. It is worthy of further investigation and development.


Chapter 12
Publication

Extract from article to be published in UQ News on the 20th of October.

12.1 Australia’s Innovators Of The Future

Craig Northway’s Real-Time Traffic Sign Recognition project is based on the work of Daimler Chrysler Research, which hopes to develop Smart Cars that avoid pedestrians and remind you of the speed limits. It involves the use of a camera mounted in the car. A computer processes the information into (sic) real time. If the project is developed further, vehicles could soon have the ability to warn drivers of pending situations or automatically take evasive action, he said. Craig’s supervisor Associate Professor Brian Lovell said he hopes the device will become marketable. He also told how the project would be demonstrated at a transport mission in Ireland later in the year.


al. [4] D. “Multi-feature hierarchical template matching using distance transforms.” 1993. [5] D. [2] S. Logemann. 849–865. 10. 1996. 103–113. 439–444. pp. of the International Conference on Pattern Recognition. Haralick.” In Proc. Gavrila. “Graph-Theoretic Clustering for Image Grouping and Retrieval. 1998. et al. last viewed on 30/03/02. 109–223. “Real-time object detection for ”smart” vehicles. pp. [6] G. Olson and D. Philomin. Gavrila and V. 6.” Web Site.” 2000. 87–93.” In International Conference on Computer Vision. 1999. M. [8] J. P. [9] G.” IEEE Conf. Ballard. 1.. 75 .Bibliography [1] C. “Realtime traffic sign recognition. no.” Image and Vision Computing. 14. H. pp. 1988.” 1998. Aksoy and R. pp. vol. “Visual routines for autonomouis driving. Huttenlocher. Saligan and D. “Hierarchical chamfer matching: A parametric edge matching algorithm.” IEEE Transactions on Image Processing. C.” IEEE Transactions on Pattern Analysis and Machine Intelligence. 1997. pp. Borgefors. vol. 1999. on Computer Vision and Pattern Recognition. “An active vision system for real-time traffic sign recognition. et. vol.. M. “Robust method for road sign detection and recognition. “Fast Object Recognition in Noisy Images using Simulated Annealing. [7] J. M. [10] M. Betke and N. Markis. [3] G. “Automatic target recognition by matching oriented edge pixels.

“PLANAR IMAGE MOSAICING BY HIERARCHICAL CHAMFER MATCHING ALGORITHM. A. last viewed on 23/03/02. 1998.76 BIBLIOGRAPHY [11] D. [15] C. D. M. 614–624.” Proceedings of the IEEE Southwest Symposium on Image Analysis and Interpretation. Oren. Jain. 15. K. pp. 2000.” IEEE Transactions on Computers. 1997. M. of the IEEE Conference on Computer Vision and Pattern Recognition. “Comparing images using the hausdorff distance. K.” Web Site. Rogahn. 2001. “A real-time histographic approach to road sign recognition. Lu and A. 19–1–19–4. “A hierarchical multiresolution technique for image registration.” Third IEEE Computer Society Workshop on Perceptual Organization in Computer Vision (POCV01). pp. pp. of the IEEE Conference on ComputerVision and Pattern Recognition. 1995.” 1995. vol.” In Proc. [17] P.” IEEE Transactions on Pattern Analysis and Machine Intelligence. 7. [14] W. of the International Conference on Computer Vision. [21] R. and F. 850–863. 193–199. Turcajova and J. Szeto. [20] Q. [16] S.-L. no. C.-M. O. 427–435. pp. “A probabilistic formulation for hausdorff matching. pp. 1996. vol. “Face Detection in Color Images. E. 457–464. “Road sign detection and recognition. [18] R. [22] E. [13] G. H. King Sun. pp. K. “A non-parametric positioning procedure for pattern classification.” Pattern Recognition. “Locating objects using the hausdorff distance. Rucklidge. 1993. “Pedestrian detection using wavelet templates. Jr. Dhanaraks and N. vol. “Perceptual Grouping for Image Retrieval and Classification. Covavisaruch. Iqbal and J. [12] P.” In Proc.” 2001. Papageorgiou and T. 8. [19] Estevez and Kehtarnavaz.” 1998. Poggio. Katsky. 1993. Olson. Aggarwal. “Hierarchical Artificial Neural Networks for Edge Enhancement.” In Proc. . 1969. Rucklidge. S. H. C. Huttenlocher and W. 26. A.

[29] K. Abdel-Mottaleb. Hebert. On Growing Better Decision Trees from Data. J. Wilson and J. Jing Huang. pp. E. Johns Hopkins University. [25] G. .” Part of the IS and T SPIE Conference on Storage and Retrieval for Image and Video Databases VII. Johnson and M. CVPR ’97. 344–371. 1990. Graphs An Introductory Approach. Murthy.BIBLIOGRAPHY 77 [23] R. S Ravi Kumar. New York. Borgefors. [24] S. [26] A. J. 34. 427–435. 1986. pp.” 1999. 1999. [28] W. K. PhD thesis. Venkata and S.” Computer Vision. Prentice Hall. John Wiley and Sons. “An Automatic Hierarchical Image Classification Scheme. 1997. K.” IEEE Computer Vision and Pattern Recognition. “Distance transforms in digital images. M. Z. paul Tremblay. 1990. 1997. Grassmann and J. New Jersey. vol. Graphics and Image Processing. “Recognzing objects by matchin oriented points. Logic and Discrete Mathematics. [27] R. “Hierarchical clustering algorithm for fast image retrieval. Watkins.


Appendix A

A.1 Assumptions

A.1.1 Speed

The camera must be able to resolve a sharp image from a fast moving vehicle. In a smart vehicle system the camera must be able to resolve images at speeds of up to 110km/h. For demonstration purposes, footage can be taken from a slow moving vehicle. This can prove the potential of the algorithm.

A.1.2 Lighting

The lighting during the filming should be reasonable, such that once again the camera can resolve the image. Due to the reflective nature of traffic signs, detection can be performed at night using this method [4, 5]. Once again, in a smart vehicle system a high quality camera capable of low light filming would be used.

A.1.3 Position

It is reasonable to assume that the traffic signs consistently appear in a similar region of the video footage. If the camera was mounted on the dash of a vehicle, the signs would tend to pass through the same area of the footage (Upper Left in Australia).

A.1.4 Angle

The relative angle between the car and the sign is close to perpendicular. If the car were in an extreme right lane, the image of the sign would be skewed severely. The HCM algorithm is unable to rectify this situation, without incorporating these skewed images into the hierarchy.

A.1.5 Damage

Signs must be assumed to be undamaged. Signs that have suffered damage may be bent, twisted or missing sections. It is fair to assume that most signs are relatively undamaged, as they are regularly maintained by local governments. Small amounts of damage should not affect the matching.

A.1.6 Size Invariance

Due to the size invariant nature of the algorithm it must be assumed that the signs pass through this size(s) as the car approaches them without being obscured. In the Daimler Chrysler system [4, 5] sign templates are two sizes, reducing the chance of them being "missed".

Figure A.1: Multi-Resolution Hierarchy [4]

A.1.7 Computer Vision Functions

The following assumption relates to the Computer Vision functions present in the IPL and Open CV libraries. If the functions are correct they should be able to produce an edge detection of the signs in most circumstances based on set thresholds, providing the video meets the previous assumptions. The Edge Detection should give a reliable single line outline of the signs.

A.1.8 Objects

One major assumption must be made about the shapes to be detected. For HCM to be effective the shape of the objects should be similar. For example a hierarchy of fruits and vegetables might be unsuccessful: the comparative shapes of bananas, oranges and potatoes are dissimilar. Even within one type of fruit, such as bananas, there is sufficient variation with similarities to create a hierarchy of bananas alone. Thus traffic signs, text of known font and car outlines are predictably shaped, similar objects suitable for HCM. Traffic signs fulfil this assumption, as there are a limited number of signs that are easily grouped into basic outline shapes.

A.2 Programming

A.2.1 MATLAB

MATLAB will not be new to most electrical engineers. It is a numerical mathematics package, able to be programmed using m-files in language similar to C or Java. It is untyped, allowing fast prototyping of algorithms, but not with sufficient structure or speed for extensive programming.

A.2.2 Direct Show

Direct Show is Microsoft's architecture for streaming media. In this system streams originate, are operated on and end in filters. A graph of filters is created. Filter graphs start with a source, e.g. File Source, USB Camera, TV tuner. These streams are operated on by filters such as splitters, decompressors etc. and displayed by renderers or written to files by writers. Filters are joined by COM objects. This could be hard coded or, by using an application from the SDK, designed graphically. It can be compiled at run-time or pre-compiled into dlls. For more information see: http://msdn.microsoft.com/default.asp

A.2.3 IPL Image Processing Library

The IPL Image Processing library was created by Intel to use their extended MMX instruction set. This uses SIMD instruction sets to perform efficient operations on media such as audio and video. SIMD stands for single instruction multiple data and is advantageous in situations where recurring operations are made to large amounts of data such as in signal processing. This has since been discontinued as a free download. For more information see: http://www.intel.com/software/products/perflib/ijl/index.htm

A.2.4 Open CV

The open source computer vision library is available free of charge for research purposes from http://www.intel.com/software/products/opensource/libraries/cvfl.htm. Image data structures

from the IPL imaging library are used extensively. It is a set of more complex image processing functions compared to the IPL library.
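Referring back to the Direct Show description in A.2.2, the fragment below builds the simplest possible "hard coded" filter graph and plays a file through it. This is only an illustrative sketch, not code from the thesis system: the file name footage.avi is a placeholder, error checking of the HRESULT values is omitted, and the application must still be linked against the DirectShow libraries as described in A.12.2.

// Minimal DirectShow filter graph: render a file and run the graph.
// Illustrative only; "footage.avi" is a placeholder file name.
#include <dshow.h>

int main()
{
    CoInitialize(NULL);                        // initialise COM

    IGraphBuilder *pGraph   = NULL;
    IMediaControl *pControl = NULL;

    // The graph builder creates and connects the source, decompressor
    // and renderer filters automatically when RenderFile is called.
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void **)&pGraph);
    pGraph->QueryInterface(IID_IMediaControl, (void **)&pControl);

    pGraph->RenderFile(L"footage.avi", NULL);  // build the graph for this file
    pControl->Run();                           // start streaming

    // ... a real application would wait for completion here ...

    pControl->Release();
    pGraph->Release();
    CoUninitialize();
    return 0;
}

A custom filter, such as the EZRGB24-based filter used in this thesis, would be inserted between the source and the renderer instead of letting RenderFile choose every filter.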

A.3 Extra Hierarchy Implementation Flowcharts

Figure A.2: findbestnotin.m Flowchart
Figure A.3: anneal.m Flowchart
Figure A.4: remove.m Flowchart

A.4 Prototype Orientation Code

A.4.1 Orientated Edge Transform

for dir = 1:4
    e2 = repmat(logical(uint8(0)), m, n);
    idxLocalMax = cannyFindLocalMaxima(dir, ax, ay, mag);
    idxWeak = idxLocalMax(mag(idxLocalMax) > lowThresh);
    e(idxWeak) = 1;
    e2(idxWeak) = 1;
    idxStrong = [idxStrong; idxWeak(mag(idxWeak) > highThresh)];
    rstrong = rem(idxStrong-1, m)+1;
    cstrong = floor((idxStrong-1)/m)+1;
    e2 = bwselect(e2, cstrong, rstrong, 8);
    e2 = bwmorph(e2, 'thin', 1);               % Thin double (or triple) pixel wide contours
    directionmatrix(:,:,dir) = dir.*(im2double(e2));   % this should create a direction map
end

A.4.2 Orientation Map

% Neighbourhoods of the current pixel: 4-connected and diagonal neighbours.
fours  = ([edge(i-1, j)   edge(i, j-1)   edge(i, j+1)   edge(i+1, j)]);
threes = ([edge(i-1, j-1) edge(i-1, j+1) edge(i+1, j-1) edge(i+1, j+1)]);
[threemin, threepos] = min(threes);
[fourmin, fourpos]   = min(fours);
if (min([threes fours]) < 40)
    if (fourmin > threemin)
        if (threemin < edge(i,j))
            newedge(i,j) = threemin + 3;
            % Copy the direction from the closest diagonal neighbour.
            if (threepos > 2)
                direction(i,j) = direction(i+1, (j-1+(threepos-3)*2));
            else
                direction(i,j) = direction(i-1, (j-1+(threepos-1)*2));
            end
        end
    else
        if (fourmin < edge(i,j))
            newedge(i,j) = fourmin + 4;
            % Copy the direction from the closest 4-connected neighbour.
            if (fourpos == 1)
                direction(i,j) = direction(i-1, j);
            elseif (fourpos == 2)
                direction(i,j) = direction(i, j-1);
            elseif (fourpos == 3)
                direction(i,j) = direction(i, j+1);
            else
                direction(i,j) = direction(i+1, j);
            end
        end
    end
end

A.5 Rejected Prototype Implementations

A.5.1 Localised Thresholding

Localised Thresholding was investigated as a technique to remove the noise caused by trees. Assigning one global threshold for the whole image in a situation such as traffic sign recognition, with many different textures and lighting conditions, is not appropriate. Where the image contains few "textures" a global threshold is excellent. In examples such as the classic figure A.5, where there are many different areas, a localised threshold for each section of the image would provide better edge detection. In areas of low gradients, the threshold could be lowered to detect fine details, whereas in higher gradient areas the threshold could be increased to show fewer edges. This would allow edges to be found, even if the maximum gradients were very low. The standard MATLAB edge detection command when used without parameters adjusts the threshold such that 70% of the pixels are registered as on. This is shown in this command: highThresh = min(find(cumsum(counts) > PercentOfPixelsNotEdges*m*n)) / 64, where PercentOfPixelsNotEdges = 0.7. The localised thresholds were to be applied using the canny edge detector.

Figure A.5: Simple Image

An informal statistical study of sections of tree image was conducted to see if any recognisable characteristics of tree gradient detection could be found to raise the threshold in these areas. As expected, areas of tree contained high average gradients with high standard deviations. There were many edges and hence variations in gradients. By localising the threshold based on the mean and/or standard deviation of the gradient image in that region it was possible to keep only the major features of the trees and signs. This would result in only major edges of the trees being found. Unfortunately the same was true of the "inside" details of signs. As the results of this thresholding method show (figure A.6), the major features were kept, similar to the MATLAB default. This was achieved by setting an average threshold, but raising it if the mean/standard deviation passed a certain threshold, and this feature detection could have been used in the first stage of a matching hierarchy to determine sign type and rough position.

Figure A.6: Localised Thresholding

The following code demonstrates the thresholding:

sigma = std2(image);
[m,n] = size(image);
EX = median(image(:));
thres = EX + 1*sigma;
thres = thres/max(image(:));
lowthres = EX/max(image(:));      % thres - 1*sigma/max(col)
if (lowthres >= thres)            % std is really low
    thres = 0.99;
    lowthres = 0.98;
end
if thres < minthres
    thres = minthres;
    lowthres = minthres - 0.05;
end

Even though this provided a reasonable starting point for the search, these calculations, standard deviation and mean, were computationally expensive, so a quicker method was sought.

A.5.2 Different Feature Extractions

A simpler method of implementing the same concept could have been using different levels of feature extraction for each level of the hierarchy. It becomes expensive to produce multiple feature extractions of the same image in MATLAB, even when a custom canny edge detection was implemented to allow the use of the same gradient image every time. The custom canny edge detection did not perform well and was rejected in favour of other possible solutions.

A.5.3 Sub-Sampling

By sub-sampling the edge detection it was hoped the general shape of the sign would remain, but the trees would "disappear" or become more random compared to the distinct outline shape of the traffic signs. It proved very difficult to sub-sample and retain the shape of the sign, as they were being detected at the smallest possible size at which the features could be resolved with the edge detection. This was to ensure they are detected as early as possible.
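As an illustration of the sub-sampling idea just described, the sketch below shrinks a binary edge image by a factor of two while keeping a pixel "on" if any pixel in the corresponding 2x2 block was on. This is not the thesis code; the function name subsampleEdges and its image layout are assumptions, and the point is only that an OR-style reduction preserves one-pixel-wide outlines better than plain decimation does.

#include <vector>

// Reduce a binary edge image by 2 in each direction.  A block is "on" if
// any of its four source pixels is on, so thin outlines are not simply
// thrown away the way they would be by keeping every second pixel.
std::vector<unsigned char> subsampleEdges(const std::vector<unsigned char> &edges,
                                          int width, int height)
{
    int ow = width / 2, oh = height / 2;
    std::vector<unsigned char> out(ow * oh, 0);
    for (int y = 0; y < oh; ++y) {
        for (int x = 0; x < ow; ++x) {
            unsigned char v = 0;
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx)
                    v |= edges[(2 * y + dy) * width + (2 * x + dx)];
            out[y * ow + x] = v;
        }
    }
    return out;
}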

A.6 UML of Real-Time System

Figure A.7: Intended Sequence Diagram
Figure A.8: Actual Class Diagram
Figure A.9: Actual Sequence Diagram

A.7 Code Details of Real-Time System

Some of the functions from the IPL Image Processing Library were used and should be documented, especially when referencing them across classes. Other complex sections of code where commenting may not be sufficient are shown.

A.7.1 Distance Transform

When using a distance transform, it is necessary to truncate the values. This can be done when scaling the image type from floating point to integer. In this first example the data is scaled such that only distances from 0-5 are included in the output. It is truncated at 5 and scaled such that each distance is ten times its value. Thus 5 = 50, 4 = 40, ..., 0 = 0.

cvDistTransform(imghinv, imghtempdist, CV_DIST_L2, CV_DIST_MASK_5, NULL);
iplThreshold(imghtempdist, imghtempdist, 5);
iplMultiplyS(imghtempdist, imghtempdist, 10);
iplScaleFP(imghgray32F, imghmult, 0, 255);

This second example from the template creation scales the distances such that all values that could be represented by an 8-bit unsigned integer are output. This will evenly place these values from zero to the maximum of the data type being scaled to. Thus 5 will be ≈ 255, 4 ≈ 200, ..., 1 ≈ 50.

cvDistTransform(imghinv, imghtempdist, CV_DIST_L2, CV_DIST_MASK_5, NULL);
iplAdd(imghtempdist, imghgray32F, imghtempdist);
iplScaleFP(imghgray32F, imghmult, 0, 255);

A.7.2 Deallocation

Deallocation when using IplImage objects seems difficult, and as presented later, in some instances of referencing, destroying the header has caused problems. So in these situations only the image data is deallocated. The structure and header must always be deallocated.

iplDeallocate(imgh, IPL_IMAGE_HEADER);
iplDeallocateImage(imgh);

It is possible to use IPL_IMAGE_ALL as a parameter, but if objects share IplROI's this will cause errors as they are also deallocated.

A.7.3 mytree

The method for creating a mytree concrete builder is not automated from the image hierarchy. It must be constructed in a bottom up approach. Various constructors are available for the leaf and node templates. The variable names help describe the process as each template is named after the letters it represented. To code this the following procedure should be used:

1. Create the root array
2. Create the arrays of leaf templates
3. Create the combinational template for this group with the array of leaves as its children array
4. Repeat 2-3 for each leaf group
5. For each intermediate stage create an array of combination templates
6. Create the combinational template of the previous combinational template, pointing to the array of combinational templates as its child
7. Repeat 5-6 as necessary
8. Point each of the root templates to the appropriate combinational template

For examples see the letter hierarchies mytreea, mytreel.
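The shape of this bottom-up construction is sketched below. The Template structure and its constructor are hypothetical stand-ins for the real mytemplatev/mytreev classes, whose actual interfaces would need to be checked against the headers; only the order of construction matters here.

#include <vector>

// Hypothetical template class standing in for the real leaf/node builders.
struct Template {
    std::vector<Template *> children;   // empty for a leaf
    Template(std::vector<Template *> kids = {}) : children(kids) {}
};

int main()
{
    // 1. the root array
    std::vector<Template *> roots;

    // 2-3. a group of leaves and the combinational template that owns them
    Template *a = new Template, *b = new Template, *c = new Template;
    Template *group1 = new Template({a, b, c});

    // 4. repeat for the next leaf group
    Template *d = new Template, *e = new Template;
    Template *group2 = new Template({d, e});

    // 5-6. an intermediate combinational template pointing at the groups
    //       (step 7 would repeat this for deeper hierarchies)
    Template *level1 = new Template({group1, group2});

    // 8. the root points at the top combinational template
    roots.push_back(level1);

    // ... the hierarchical search would start from roots ...
    return 0;
}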

A.7.4 Template Format

The necessary format for images to be included in the template is an unsigned character file of each pixel (much like a bitmap). The pixel ordering is different. The MATLAB file templatecreate.m converts the images to the *.tmp format. Then they can just be read with a FILE pointer into a BYTE array. Use cvSetData to set an IplImage object to point to the data.

FILE *p_filemask;
BYTE *p_datamask = new BYTE[TEMPX*TEMPY*3];
p_filemask = fopen(maskname, "rb");
fread(p_datamask, 3, TEMPX*TEMPY, p_filemask);
fclose(p_filemask);
imghmask = cvCreateImageHeader(cvSize(TEMPX, TEMPY), IPL_DEPTH_8U, 3);
cvSetData(imghmask, p_datamask, TEMPX*3);

A.8 Tested Enhancements/Refinements to Real-Time System

Spiral Design

In a traffic sign matching scenario, there are particular assumptions (already stated) that can be made about the location of a traffic sign. The sign is more likely to be at a particular height, in a particular horizontal area. This property can be used to increase the speed of the search. The following search pattern was designed:

Figure A.10: Spiral Search Pattern

Figure A.11: Straight Search Pattern

Compared to a simple search following this or a similar straight pattern (figure A.11), it can be faster if stopped when a match is found. Thus this search would only be of advantage if only one sign was assumed present and false matches could be guaranteed not to occur. If there are multiple signs to be detected, or false matches are likely, the entire area should be searched and any advantage of the spiralling search is lost. This design possibility was ruled out due to the likely occurrences of false matches shown by the prototype matching application.
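Although the idea was ruled out, a sketch of how such an ordering could be generated is shown below for reference. It is an illustration rather than the thesis implementation: the function name spiralOrder and the parameters step and rings are assumptions, and the starting point (cx, cy) would be the expected sign position from the assumptions in A.1.3.

#include <cstdio>
#include <vector>

struct Pos { int x, y; };

// Generate search positions in an outward square spiral around (cx, cy),
// stepping by `step` pixels, up to `rings` rings.  Positions come back
// most-likely-first, so the search can stop at the first confident match.
std::vector<Pos> spiralOrder(int cx, int cy, int step, int rings)
{
    std::vector<Pos> order;
    order.push_back({cx, cy});
    for (int r = 1; r <= rings; ++r) {
        int d = r * step;
        for (int x = cx - d; x <= cx + d; x += step) {   // top and bottom rows
            order.push_back({x, cy - d});
            order.push_back({x, cy + d});
        }
        for (int y = cy - d + step; y <= cy + d - step; y += step) { // sides
            order.push_back({cx - d, y});
            order.push_back({cx + d, y});
        }
    }
    return order;
}

int main()
{
    for (const Pos &p : spiralOrder(160, 60, 10, 2))
        std::printf("(%d,%d) ", p.x, p.y);
    return 0;
}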


Temporal Filtering to Remove Trees

Temporal Filtering was briefly investigated to remove trees. A simple subtraction of background from frame to frame would remove objects that change little. Due to the noise-like nature of the tree edge detections, they are likely to change slightly regardless. The possibility of a sign being surrounded by trees, and hence affected by this subtraction, is also too great. Testing showed that signs weren't affected and trees were thinned of noise, but not sufficiently to make the overhead of background temporal filtering worthwhile.
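For reference, the kind of frame-to-frame subtraction that was trialled looks roughly like the sketch below; the function name temporalFilter and the threshold parameter are assumptions, and this is only an illustration of the idea, not the code that was tested.

#include <cstdlib>
#include <vector>

// Keep only edge pixels that changed by more than `thresh` since the last
// frame.  Pixels that persist unchanged are suppressed, while noisy tree
// edges change slightly from frame to frame and are only partially removed.
void temporalFilter(const std::vector<unsigned char> &prev,
                    const std::vector<unsigned char> &curr,
                    std::vector<unsigned char> &out, int thresh)
{
    out.resize(curr.size());
    for (std::size_t i = 0; i < curr.size(); ++i) {
        int diff = std::abs(int(curr[i]) - int(prev[i]));
        out[i] = (diff > thresh) ? curr[i] : 0;
    }
}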

Oriented Edges

Implementing the oriented edge algorithm prototyped in MATLAB was briefly attempted. Modifying the Open CV source code proved difficult and time consuming due to the poor documentation and commenting. When the forward and reverse matching was completed, it was demonstrated that orientation information was not necessary with other refinements.

Expanding all Possibilities or Best

At each level of the hierarchical search there are multiple methods of expanding the tree. The two main possibilities in this design are: taking the best match above the thresholds at each level, or taking every match above the thresholds at each level. The initial design takes the best match at each level. Due to the small (< 50) nature of the hierarchies in this example it is unlikely the incorrect path will be chosen by taking the best option. In practice this appeared to provide sufficiently accurate results. This will also improve the efficiency of the matching. Taking every match would require a global knowledge of the results of each "thread" of the recursive search. At the end of the matching process, if multiple signs (leaf nodes) had been detected, they would need to be compared. By allowing only one "thread" at each level the leaf template found can be displayed (by copying onto the output), and then the process can iterate to the next position, with no knowledge of which template was found.
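The difference between the two expansion strategies can be sketched as follows. The TemplateNode structure and the match() stub are hypothetical placeholders for the real template classes and the chamfer scoring; only the control flow of "best child only" versus "every child above threshold" is the point.

#include <vector>

struct TemplateNode {
    std::vector<TemplateNode *> children;   // empty means this is a leaf
};

// Hypothetical scoring function (stub): in the real system this would be
// the truncated distance-transform score of the template at (x, y).
double match(const TemplateNode *, int, int) { return 0.0; }

// Strategy used in the design: follow only the best child above threshold.
const TemplateNode *searchBest(const TemplateNode *node, int x, int y, double thresh)
{
    if (node->children.empty())
        return node;                         // leaf found
    const TemplateNode *best = nullptr;
    double bestScore = thresh;
    for (std::size_t i = 0; i < node->children.size(); ++i) {
        double s = match(node->children[i], x, y);
        if (s < bestScore) { bestScore = s; best = node->children[i]; }
    }
    return best ? searchBest(best, x, y, thresh) : nullptr;
}

// Alternative: expand every child that passes, collecting all leaves for
// a later comparison step.
void searchAll(const TemplateNode *node, int x, int y, double thresh,
               std::vector<const TemplateNode *> &leaves)
{
    if (node->children.empty()) { leaves.push_back(node); return; }
    for (std::size_t i = 0; i < node->children.size(); ++i)
        if (match(node->children[i], x, y) < thresh)
            searchAll(node->children[i], x, y, thresh, leaves);
}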


Reducing the truncated distance

Initially the design used the distance transform straight from the cvDistTransform function. As opposed to the MATLAB implementation, this calculated, using the two pass method [25], the distance to "infinity", i.e. the furthest distance in that image. The MATLAB implementation only used a set number of iterations. Comparative one dimensional cross sections of these distance transforms of a point would be as follows:

Figure A.12: Untruncated Distance Transform

Figure A.13: Truncated Distance Transform

It can be seen that points a fair way from the edge detection are still given a high value in the untruncated situation. In the truncated case, pixels beyond a certain distance are discounted, similar to a weighted hausdorff matching technique. By only allowing pixels close to the object to score, poor matching features get ignored, increasing the accuracy of the matching.
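A minimal version of this truncation, applied to a floating point distance image before conversion, might look like the sketch below. It is illustrative only; in the thesis the same effect is produced through the IPL scaling calls shown in A.7.1, and the function name truncateDistance is an assumption.

#include <algorithm>
#include <vector>

// Clamp a distance transform at `maxDist` so far-away pixels cannot
// dominate the average score, similar to a weighted Hausdorff measure.
void truncateDistance(std::vector<float> &dist, float maxDist)
{
    for (std::size_t i = 0; i < dist.size(); ++i)
        dist[i] = std::min(dist[i], maxDist);
}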

Maximum vs. Minimum

The search was tested both as a search for the maximum match and as a search for the minimum. Maximum matching involves inverting the distance transform. This has the benefit of not weighting pixels that don't match, as the truncated pixels are zero.

No noticeable difference could be seen between either.

Scaling the distance

By scaling the inverted truncated distance I can control the "weighting" given to pixels relative to the threshold. The distance matching scores are an average per pixel. If pixels outside the truncated distance are scored as the maximum of the image type, i.e. 255, they will contribute nothing but their presence in the average. Missing features can destroy a match. If they are given zero, I can control the weighting of pixels that are scored. If I were to use 255, 254, 253, ... to represent 0, 1, 2, ..., little accuracy over the scale is given, due to the poor resolution. By scaling this to 250, 240, 230, or other similar amounts, they can still weight the score. Non-linear functions (including the truncation mentioned earlier) could be applied to effect the matching.
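This weighting can be written down concretely as a small mapping from truncated distance to score. The numbers used here (a truncation of 5 and steps of 10) follow the examples in A.7.1, but the function itself, including its name distanceToWeight, is only an illustration of the idea.

// Map a truncated distance (0..maxDist) to an inverted 8-bit weight.
// Distance 0 (a perfect edge hit) scores 255; pixels at or beyond the
// truncation distance score 0, so they only dilute the average rather
// than destroying the match the way an untruncated penalty would.
unsigned char distanceToWeight(int dist, int maxDist = 5, int step = 10)
{
    if (dist >= maxDist)
        return 0;
    return static_cast<unsigned char>(255 - dist * step);
}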

A.9 Hierarchy Results

A.9.1 Diamond Signs

Figure A.14: Original Scores

Figure A.15: Optimised Scores

The following leaf level groupings were generated, figures A.16-A.29.

Figure A.16: First Group
Figure A.17: First Group Template
Figure A.18: Second Group, template = self
Figure A.19: Third Group, template = self
Figure A.20: Fourth Group
Figure A.21: Fourth Group Template
Figure A.22: Fifth Group
Figure A.23: Fifth Group Template
Figure A.24: Sixth Group
Figure A.25: Sixth Group Template
Figure A.26: Seventh Group
Figure A.27: Seventh Group Template
Figure A.28: Eighth Group
Figure A.29: Eighth Group Template

It is apparent from inspection of the groups that the optimisation is "sensible". Though the crossroad image has not been placed in a group with the left side road image, upon closer inspection it can be seen that despite their likeness neither has similar features aligned with the sign outline. By applying the same commands on the template images, the next level of the hierarchy is generated. The first grouping was of the 1st, 2nd and 4th groups (figure A.30). The second group was of the 3rd and 5th groups (figure A.32). The Crossroad sign was still "by itself" and the last grouping was of the 7th and 8th groups (figure A.34).

Figure A.30: First Template Group
Figure A.31: First Template Group Combinational Template
Figure A.32: Second Template Group
Figure A.33: Second Template Group Combinational Template
Figure A.34: Last Template Group
Figure A.35: Last Template Group Combinational Template

A.9.2 Circular Signs Scores

Figure A.36: Second Level Optimisation

A.10 MATLAB Matching Results

This is the original image (figure A.37), the oriented edge detection (figure A.38) and distance transform (figure A.39).

Figure A.37: Original Image
Figure A.38: Oriented Edge Image
Figure A.39: Distance Transform Image

This diagram shows the scores achieved at points throughout the image. The unnecessary expansion of the search over noisy areas of the edge detection can be seen (figures A.40, A.41), and the match (figure A.42).

Figure A.40: Scores
Figure A.41: Closer View of Scores
Figure A.42: Match

A.11 CD

Included on the CD (with the code) are a PDF version of this document and demonstration footage of matching, retaining the directory name of the example it is based on.

A.12 Code

A.12.1 Code Listing

All code is in the directory "codelisting". The MATLAB Hierarchy creation code is in the "hierarchy" directory, the MATLAB matching code in the "matching" directory, and the real-time code in "EZRGB24".

Table A.1: Hierarchy

File                Description
alreadyin           Checks if the value is already in an array
chamfer             Performs a 3-4 chamfer Transform
combinegroups       Combines the groups
createhier          Creates the hierarchy
createtemp          Creates templates
createTree          Imports files from directory
directionchamfer.m  3-4 Chamfer Transform, also produces direction information
findbestnotin       Finds the best group not in the hierarchy
remove              Removes a group from a hierarchy
setup1              Makes all the pairs, triplets and quads

MATLAB Matching

This code is "unfinished", hence only useful files are described. The file for creating templates (templatecreate.m) is in the "coexisting" directory. Templates have been included in the "templates" directory under coexisting but not described.

Table A.2: MATLAB Matching

File                        Description
nonmaxsuppression           Simple Non-maximal Suppression
R10Simplepyroverlay         fine/coarse search using one template
overlay                     takes an image and a template and simply translates the template across the image
simplepyrdirectedoverlay.m  simple pyramid overlay with edge direction
simplepyroverlay.m          simple pyramid overlay
simplepyroverlay - simple   fine/coarse overlaying

Real-Time Header files are also included.

Table A.3: Real-Time Hierarchy

File         Description
EZRGB24      The Filter
mytree       Abstract Builder for Trees
mytemplatev  Template Class
mytreev      A Concrete Builder
mytreel      Another Concrete Builder

A.12.2 Compilation

Compilation of the Visual C++ code requires the installation of Microsoft's DirectX SDK, the IPL Image Processing Library and OpenCV. Include paths must be set. It must also be able to find the *.lib files for IPL and OpenCV at compile time and the appropriate *.dlls at run time. The EZRGB24 example which this is based on needs the baseclasses as referenced in the project settings. To run the code templates must be found (hidden in codelisting/templates/temps). See Brian Lovell's (lovell@itee.uq.edu.au) notes from ELEC4600 on compiling direct show applications for more information.
