Lecture 1
Hough Transform, Model Fitting

Hw3 has been posted
Next Class: Planar Unwarping
A line y = mx + n in image space has slope m and intercept n.
(Figure: a point P(x,y) in image space maps to the line n = (-x) m + y in the (m, n) Parameter Space.)
Img-Param Spaces

Image Space          Parameter Space
Lines                Points
Points               Lines
Collinear points     Intersecting lines
Practical Issues

The slope of the line is -∞ < m < ∞
The parameter space is INFINITE
The representation y = mx + n does not express lines of the form x = k
Solution:

Use the "Normal" equation of a line:

y = mx + n  →  ρ = x cosθ + y sinθ

θ is the orientation of the normal
ρ is the distance between the origin and the line
(Figure: a point P(x,y) and the line's normal at angle θ; sign conventions ρ < 0, θ > 0 and ρ > 0, θ < 0.)
Consequence:

A point in Image Space is now represented as a SINUSOID:

ρ = x cosθ + y sinθ
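A quick numerical check of this duality (my own sketch, not from the slides): two points chosen on the same line, here x + y = 10, trace sinusoids in (θ, ρ) space that intersect at that line's normal parameters.

```python
import numpy as np

theta = np.linspace(0.0, np.pi, 181)              # sampled orientations
(x1, y1), (x2, y2) = (2.0, 8.0), (6.0, 4.0)       # both satisfy x + y = 10

rho1 = x1 * np.cos(theta) + y1 * np.sin(theta)    # sinusoid of point 1
rho2 = x2 * np.cos(theta) + y2 * np.sin(theta)    # sinusoid of point 2

i = np.argmin(np.abs(rho1 - rho2))                # where the sinusoids cross
print(theta[i], rho1[i])                          # theta = pi/4, rho = 10/sqrt(2)
```

The crossing at θ = π/4, ρ ≈ 7.07 is exactly the normal form of x + y = 10.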
Hough Transform Algorithm

Input is an edge image (E(i,j) = 1 for edgels)
1. Discretize θ and ρ in increments of dθ and dρ. Let A(R,T) be an array of integer accumulators, initialized to 0
2. For each pixel E(i,j) = 1 and h = 1, 2, …, T do:
   1. ρ = i cos(h·dθ) + j sin(h·dθ)
   2. Find the closest integer k corresponding to ρ
   3. Increment counter A(h,k) by one
3. Find local maxima in A(R,T)
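The steps above can be sketched in numpy as follows (a sketch under my own naming; here θ bins run h = 0, …, T−1 and negative ρ values are shifted into the array index):

```python
import numpy as np

def hough_lines(edge, d_theta=np.pi / 180, d_rho=1.0):
    """Accumulate votes A(h, k) for lines rho = i*cos(theta) + j*sin(theta).

    edge is a binary image with E(i, j) = 1 for edgels.  Step 1 of the
    slides: discretize theta and rho, zero an integer accumulator.
    """
    T = int(round(np.pi / d_theta))                    # number of theta bins
    diag = np.hypot(*edge.shape)                       # largest possible |rho|
    A = np.zeros((T, int(round(2 * diag / d_rho)) + 1), dtype=int)
    thetas = np.arange(T) * d_theta
    for i, j in zip(*np.nonzero(edge)):                # step 2: every edgel votes
        rho = i * np.cos(thetas) + j * np.sin(thetas)       # step 2.1
        k = np.round((rho + diag) / d_rho).astype(int)      # step 2.2: closest bin
        A[np.arange(T), k] += 1                             # step 2.3: increment
    return A, thetas, diag

# Step 3 (here just the global maximum): recover the dominant line.
edge = np.eye(50, dtype=int)                           # the 45-degree line j = i
A, thetas, diag = hough_lines(edge)
h, k = np.unravel_index(np.argmax(A), A.shape)
print(thetas[h], k * 1.0 - diag)                       # theta = 3*pi/4, rho ~ 0
```

All 50 points of the diagonal land in one accumulator cell, so the peak count equals the number of collinear edgels.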
Hough Transform for Curves

The H.T. can be generalized to detect any curve that can be expressed in parametric form:

y = f(x, a1, a2, …, ap)

a1, a2, …, ap are the parameters
The parameter space is p-dimensional
The accumulating array is LARGE!
HT for Circles

(Figure: a circle of radius R centered at (a,b); each edge point votes for all candidate centers (a,b) and radii R, a 3-D parameter space.)
HT for Circles

(Figure: the same circle, now with the gradient direction θ known at the edge point, which constrains where the center can be.)
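A sketch of the gradient-direction idea (function and variable names are my own): with a known radius R and the gradient direction θ at each edge point, the center must lie at a = x − R cosθ, b = y − R sinθ, so each edgel casts a single vote instead of a whole circle of votes.

```python
import numpy as np

def hough_circle_centers(points, thetas, R, size):
    """Vote for circle centers (a, b), assuming a KNOWN radius R and the
    gradient direction theta at each edge point."""
    A = np.zeros((size, size), dtype=int)
    for (x, y), t in zip(points, thetas):
        a = int(round(x - R * np.cos(t)))        # candidate center
        b = int(round(y - R * np.sin(t)))
        if 0 <= a < size and 0 <= b < size:
            A[a, b] += 1                         # one vote in (a, b) space
    return A

# Synthetic circle: center (30, 30), radius 10, outward gradient.
angles = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts = [(30 + 10 * np.cos(t), 30 + 10 * np.sin(t)) for t in angles]
A = hough_circle_centers(pts, angles, R=10, size=64)
a, b = np.unravel_index(np.argmax(A), A.shape)
print(int(a), int(b), A[a, b])                   # -> 30 30 60
```

When R is unknown, the same loop runs once per candidate radius, giving the 3-D accumulator of the previous slide.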
Generalizing the H.T.

The H.T. can be used even if the curve does not have a simple analytic form!

For each boundary point Pi, let φi be its gradient orientation and (ri, αi) the polar vector to a reference point (xc, yc):

xc = xi + ri cos(αi)
yc = yi + ri sin(αi)

H.T. table: indexed by gradient orientation φ; each row stores the observed (r, α) pairs:

φ1 : (r11,α11), (r12,α12), …, (r1n1,α1n1)
φ2 : (r21,α21), (r22,α22), …, (r2n2,α2n2)
…
φm : (rm1,αm1), (rm2,αm2), …, (rmnm,αmnm)
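A minimal sketch of the H.T. table in use (names and binning are my own assumptions): build the table from a model shape at a known reference point, then let every scene point vote xc = x + r cos(α), yc = y + r sin(α) using its gradient orientation's row.

```python
import numpy as np
from collections import defaultdict
from math import atan2, cos, sin, hypot, pi

def phi_bin(phi, bins=36):
    """Quantize a gradient orientation into one of `bins` table rows."""
    return int(phi % (2 * pi) / (2 * pi) * bins) % bins

def build_ht_table(boundary, phis, ref, bins=36):
    """Row phi holds the (r, alpha) vectors from boundary points with that
    gradient orientation to the reference point (xc, yc)."""
    table = defaultdict(list)
    for (x, y), phi in zip(boundary, phis):
        table[phi_bin(phi, bins)].append(
            (hypot(ref[0] - x, ref[1] - y), atan2(ref[1] - y, ref[0] - x)))
    return table

def vote_centers(points, phis, table, size, bins=36):
    """Each point applies its row: xc = x + r*cos(alpha), yc = y + r*sin(alpha)."""
    A = np.zeros((size, size), dtype=int)
    for (x, y), phi in zip(points, phis):
        for r, alpha in table[phi_bin(phi, bins)]:
            xc, yc = round(x + r * cos(alpha)), round(y + r * sin(alpha))
            if 0 <= xc < size and 0 <= yc < size:
                A[xc, yc] += 1
    return A

# Learn a circular "shape" centered at (10, 10), then find it at (40, 40).
ts = [2 * pi * k / 60 for k in range(60)]
model = [(10 + 5 * cos(t), 10 + 5 * sin(t)) for t in ts]
table = build_ht_table(model, ts, ref=(10, 10))
scene = [(40 + 5 * cos(t), 40 + 5 * sin(t)) for t in ts]
A = vote_centers(scene, ts, table, size=64)
xc, yc = np.unravel_index(np.argmax(A), A.shape)
print(int(xc), int(yc))                          # -> 40 40
```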
H.T. Summary

H.T. is a "voting" scheme:
points vote for a set of parameters describing a line or curve.
The more votes for a particular set,
the more evidence that the corresponding curve is present in the image.
Can detect MULTIPLE curves in one shot.
Computational cost increases with the number of parameters describing the curve.
Segmentation by Fitting

Fitting Lines

Using Least Squares fitting error: E(m, n) = Σi (yi − (m xi + n))²
Line fitting can be maximum likelihood - but the choice of model is important.
Fitting Lines

Minimizing the perpendicular distances Σi (a xi + b yi − d)² with a² + b² = 1 is an Eigenvalue Problem!
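A sketch of that eigenvalue formulation (total least squares; names are my own): the optimal normal (a, b) is the eigenvector of the 2×2 scatter matrix of the centered data with the smallest eigenvalue.

```python
import numpy as np

def fit_line_tls(x, y):
    """Total least squares: minimize the sum of squared PERPENDICULAR
    distances |a*x + b*y - d| subject to a^2 + b^2 = 1."""
    xm, ym = x.mean(), y.mean()
    U = np.stack([x - xm, y - ym], axis=1)   # centered data
    w, V = np.linalg.eigh(U.T @ U)           # eigenvalues in ascending order
    a, b = V[:, 0]                           # smallest-eigenvalue eigenvector
    return a, b, a * xm + b * ym             # line: a*x + b*y = d

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 * x + 1                                # exact line y = 2x + 1
a, b, d = fit_line_tls(x, y)
print(-a / b, d / b)                         # slope ~ 2.0, intercept ~ 1.0
```

Unlike y = mx + n least squares, this representation handles vertical lines (b = 0) with no special case.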
Slide by D.A. Forsyth
Inliers-Outliers
Problems with Outliers

Least squares estimation is very sensitive to outliers:
a few outliers can GREATLY skew the result.
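A tiny demonstration of that sensitivity (my own example data): moving a single point off the line y = 2x + 1 more than triples the least-squares slope.

```python
import numpy as np

x = np.arange(10, dtype=float)
y = 2 * x + 1                        # ten clean points on y = 2x + 1
m_clean, _ = np.polyfit(x, y, 1)     # ordinary least-squares line fit

y_out = y.copy()
y_out[9] = 100.0                     # ONE outlier (the true value is 19)
m_out, _ = np.polyfit(x, y_out, 1)
print(m_clean, m_out)                # slope jumps from 2.0 to ~ 6.4
```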
Outliers are not the only problem

Multiple structures can also skew results.
The fitting procedure implicitly assumes ONE instance.
Robustness

As we have seen, squared error can be a source of bias in the presence of noise points.
One fix is EM - we'll not do this in this class.
Another is an M-estimator:
square nearby, threshold far away.
A third is RANSAC:
search for good points.
Robust Estimation

Two steps:
1. Classify data into INLIERS and OUTLIERS
2. Use only INLIERS to fit the model
M-Estimator

Replace the squared residual by a robust function of the residual:

E(θ) = Σi ρ( r(xi; θ); σ )

where r(xi; θ) is the residual error at xi, θ is the set of parameters being fitted, and σ is a parameter controlling the influence of large errors. A common choice: ρ(u; σ) = u² / (σ² + u²).
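One standard way to minimize such a cost is iteratively reweighted least squares (a sketch under my own naming, using ρ(u; σ) = u²/(σ² + u²)): refit with weights that keep nearby points ("square nearby") and suppress distant ones ("threshold far away").

```python
import numpy as np

def m_estimate_line(x, y, sigma=1.0, iters=20):
    """Fit y = m*x + n minimizing sum_i rho(r_i; sigma) with
    rho(u) = u^2 / (sigma^2 + u^2), by iteratively reweighted least squares."""
    m, n = np.polyfit(x, y, 1)                    # least-squares start
    for _ in range(iters):
        r = y - (m * x + n)                       # residual error at each x_i
        w = sigma**2 / (sigma**2 + r**2) ** 2     # IRLS weight = rho'(r) / (2r)
        m, n = np.polyfit(x, y, 1, w=np.sqrt(w))  # weighted refit
    return m, n

x = np.arange(10, dtype=float)
y = 2 * x + 1
y[9] = 100.0                                      # one gross outlier
m, n = m_estimate_line(x, y)
print(m, n)                                       # ~ 2 and ~ 1: outlier suppressed
```

The same outlier that skewed plain least squares above ends up with near-zero weight, so the fit returns to the true line.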
Issues:

Choosing σ. (Figures: a fit with too large σ; a fit with a good σ value.)
RANSAC

RANdom SAmple Consensus

RANSAC procedure (line example)
RANSAC

Choose a small subset uniformly at random
Fit to that
Anything that is close to the result is signal; all others are noise
Refit
Do this many times and choose the best
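The steps above can be sketched for the line case as follows (a sketch with my own names and thresholds, not the slides' code):

```python
import numpy as np

def ransac_line(x, y, n_trials=50, thresh=1.0, seed=0):
    """RANSAC: repeatedly sample 2 points, fit a line to them, count the
    points close to it (the consensus set); keep the largest consensus set
    and refit the line to those inliers only."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)  # small random subset
        if x[i] == x[j]:
            continue                                      # skip a vertical pair
        m = (y[j] - y[i]) / (x[j] - x[i])                 # fit to that subset
        n = y[i] - m * x[i]
        inliers = np.abs(y - (m * x + n)) < thresh        # close = signal
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    m, n = np.polyfit(x[best_inliers], y[best_inliers], 1)  # refit on inliers
    return m, n, best_inliers

x = np.arange(12, dtype=float)
y = 2 * x + 1
y[[3, 7]] = [40.0, -30.0]                    # two gross outliers among 12 points
m, n, inl = ransac_line(x, y)
print(round(m, 3), round(n, 3), inl.sum())   # ~ 2.0, ~ 1.0, 10 inliers
```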
ISSUES
Line example

12 pts: n = 12
Sample size: s = 2
Outliers: 2, so e = 2/12 ≈ 17% (use e ≈ 20%)
N = 5 gives a 99% chance of getting a good sample
(trying every possible pair requires 66 trials!)
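The N = 5 figure comes from the standard sample-count bound N ≥ log(1 − p) / log(1 − (1 − e)^s), where p is the desired confidence that at least one sample is outlier-free; checking the slide's numbers:

```python
from math import ceil, log, comb

p, e, s, n = 0.99, 0.2, 2, 12   # confidence, outlier ratio, sample size, points
N = ceil(log(1 - p) / log(1 - (1 - e) ** s))
print(N, comb(n, 2))            # -> 5 66
```

Five random pairs suffice where exhaustive search over all C(12, 2) = 66 pairs would be needed otherwise.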
After RANSAC