3D Measurement Technology by Structured Light Using

Stripe-Edge-Based Gray Code
H B Wu, Y Chen, M Y Wu, C R Guan and X Y Yu
College of Measurement-Control Tech & Communications Engineering, Harbin
University of Science and Technology, Harbin, 150080, China
E-mail: woo@hrbust.edu.cn
Abstract. The key problem of 3D vision measurement by triangulation based on
structured light is to acquire the projecting angle of the projected light accurately. In order to
acquire the projecting angle, and thereby determine the correspondence between sampling
points and image points, a method for encoding and decoding structured light based on the
stripe edges of Gray code is presented. The method encodes with Gray code stripes and
decodes with stripe edges acquired by sub-pixel technology instead of pixel centres, so the
latter's one-bit decoding error is removed. The accuracy of image sampling point location, and
of the correspondence between image sampling points and object sampling points, reaches
sub-pixel level. In addition, the measurement error caused by the irregular division of the
projecting angle by even-width encoding stripes is analysed and corrected. The encoding and
decoding principle and the decoding equations are described. Finally, 3ds Max and Matlab
software were used to simulate the measurement system and reconstruct the measured surface.
The experimental results indicate a measurement error of about 0.05%.
1. Introduction
Optical 3D measurement technology is one of the most effective ways to acquire 3D information
about an object. It is a non-contact technique, with the advantages of no contact with the measured
surface and high sampling density. Among these methods, encoded structured light is widely used in
fields such as 3D reconstruction and industrial measurement because of its high accuracy, high
measuring speed, low cost and so on.
Provided the system parameters have been acquired by calibration, the key problem in the encoded
structured light method is determining the image sampling points and their correspondence with the
object sampling points and the encoding stripe regions (namely the projecting angles) in the encoding
patterns. Encoding methods can be sorted into time encoding, space encoding and direct encoding,
each with its own merits and drawbacks.
Existing time encoding methods divide the projecting angle by binary code or Gray code, sometimes
combined with orthogonal phase shift [1,2] or hierarchical [3] patterns to subdivide the projecting
angle. The methods above adopt pixels as image sampling points, which is named pixel-centre decoding.
Institute of Physics Publishing, Journal of Physics: Conference Series 48 (2006) 537–541
doi:10.1088/1742-6596/48/1/101, International Symposium on Instrumentation Science and Technology
© 2006 IOP Publishing Ltd

A binary code may have several differing bits between adjacent code values. In the course of
decoding, some pixels may be situated at a stripe edge several times across the intensity images, so
their code values can be misjudged several times; if a high bit is misjudged, the decoding error is
large. A Gray code has only one differing bit between any adjacent code values, and each bit has the
same weight. In the course of decoding, any pixel is situated at a stripe edge at most once across the
intensity images, so its code value can be misjudged by at most one bit, and the decoding error caused
by any misjudged bit is only one bit. However, the influence of this one-bit decoding error on
accurately acquiring the projecting angle is hard to remove.
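The one-bit property of Gray code described above can be checked with a short sketch (the `gray` helper below is a standard binary-reflected Gray code construction, illustrative rather than taken from the paper):

```python
# Illustrative check: adjacent Gray code values differ in exactly one bit,
# while adjacent binary values may differ in several bits.

def gray(m):
    # standard binary-reflected Gray code of integer m
    return m ^ (m >> 1)

# adjacent Gray codes: always exactly one differing bit
one_bit = all(bin(gray(m) ^ gray(m + 1)).count("1") == 1 for m in range(15))

# adjacent binary codes: e.g. 7 (0111) -> 8 (1000) flips four bits at once
binary_flips = bin(7 ^ 8).count("1")

print(one_bit, binary_flips)  # True 4
```

This is why a pixel lying on a stripe edge can corrupt at most one bit of a Gray code word, whereas it can corrupt several bits of a binary code word.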
In order to remove the one-bit decoding error of pixel-centre-based Gray code, and to increase the
accuracy of image sampling point location and of the correspondence between image sampling points
and object sampling points, an encoding and decoding method based on the stripe edges of Gray code
is presented.
2. Encoding and decoding method based on stripe edge of Gray code
The method adopts black-and-white Gray code encoding patterns. In the course of decoding, unlike
pixel-centre decoding, the method locates the stripe edges in each intensity image (before
binarization) by sub-pixel location technology, then adopts the points on the edges as image sampling
points, whose grey values (0 or 1) in the binarized intensity images are used to acquire the Gray
code. The Gray code value is used to determine the correspondence between an edge in the intensity
image and the encoding pattern, and to acquire its projecting angle. The method includes two steps.
2.1. Acquiring stripe edge ordinal number in intensity image
The purpose of this step is to acquire the ordinal number, in the encoding pattern, of each stripe
edge found in the intensity image. As shown in figure 1, for example, four Gray code patterns are
projected, generating 2^4 − 1 = 15 edges. When acquiring an edge ordinal number in the fourth
intensity image, the Gray code is determined by the grey values (0 or 1) at the corresponding position
in the former (namely the 1st, 2nd and 3rd) binarized intensity images; the edge ordinal number is
then figured out by equation (1).
k = [(B_1 B_2 ⋯ B_{i−1})_2]_{10} · 2^{n−i+1} + 2^{n−i},  B_j = G_1 ⊕ G_2 ⊕ ⋯ ⊕ G_j   (1)

where k = 1, 2, …, 2^n − 1 is the edge ordinal number; n is the total number of intensity images;
i = 1, 2, …, n is the ordinal number of the intensity image in which the edge appears; G_j is the grey
value at the edge position in binarized intensity image j, with B_0 = G_0 = 0.
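A minimal sketch of this decoding step, assuming n projected patterns and the usual Gray-to-binary conversion with G_0 = 0 (function and variable names are illustrative, not from the paper):

```python
# Sketch of stripe-edge decoding per equation (1): the grey values read at the
# edge position in the earlier binarized images identify which edge of
# pattern i this is, and hence its ordinal number k among all 2^n - 1 edges.

def edge_ordinal(gray_bits, n):
    """Ordinal number k of an edge found in intensity image i = len(gray_bits) + 1.

    gray_bits: grey values G_1..G_{i-1} (each 0 or 1) at the edge position
    in the earlier binarized intensity images; n: total number of patterns.
    """
    i = len(gray_bits) + 1
    # Gray-to-binary conversion: B_j = B_{j-1} XOR G_j, with B_0 = G_0 = 0
    b, d = 0, 0
    for g in gray_bits:
        b ^= g
        d = (d << 1) | b          # accumulate (B_1 B_2 ... B_{i-1})_2
    # k = d * 2^(n-i+1) + 2^(n-i)
    return d * 2 ** (n - i + 1) + 2 ** (n - i)

# n = 4 patterns give 2^4 - 1 = 15 edges; the single edge of the first
# pattern is the centre edge, and edges of the fourth pattern are the odd ones.
print(edge_ordinal([], 4))         # 8  (edge of image 1)
print(edge_ordinal([0, 0, 0], 4))  # 1  (leftmost edge of image 4)
```

Running the helper over every possible bit string for i = 1..4 yields each of the 15 ordinal numbers exactly once, which matches the edge layout of figure 1.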
Figure 1. Stripe edge ordinal number (ordinal numbers 1–15 across four time-sequence patterns).
Figure 2. Even-width-stripe encoding (edge positions A–G between end points S and T along the x axis,
with projecting angles α_0, α_1 and α_11–α_14 marked).
2.2. Acquiring the projecting angle corresponding to an edge in the encoding pattern
According to the edge ordinal number k figured out by equation (1), the corresponding projecting
angle can be acquired. The projecting angle range of the projector is set to 2α_1, and the angle
between the projecting centreline and the x axis is set to α_0. For example, when three Gray code
patterns are projected, the positions of the seven stripe edges within the projecting angle are shown
in figure 2, where A, B, C, D, E, F, G are the positions of the stripe edges whose ordinal numbers are
1, 2, 3, 4, 5, 6, 7.
Encoding by even-width-stripe patterns has the advantage of easy realization, but the disadvantage of
dividing the projecting angle irregularly: the projecting angle range corresponding to a thinnest
stripe decreases from the centre towards both sides. As shown in figure 2, the thinnest-stripe widths
satisfy SA = AB = BC = CD = DE = EF = FG = GT, while the projecting angles satisfy
α_11 > α_12 > α_13 > α_14. If the regions corresponding to the thinnest stripes are treated as equal
angles, a projecting angle error arises that bends the reconstructed surface at both sides.
The even-width-stripe projecting angle can be figured out by equation (2), which corrects the error
above. Putting the edge ordinal number k into equation (2), the corresponding projecting angle is
acquired.

α = α_0 + arctan[ (k − 2^{n−1}) · tan(α_1) / 2^{n−1} ]   (2)
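Equation (2) can be sketched as follows, with α_0 and α_1 assumed to come from calibration (function and parameter names are illustrative):

```python
import math

# Sketch of the even-width-stripe projecting-angle correction of equation (2):
# edge positions are equally spaced in x, so angles follow a tangent law
# rather than being equally spaced.

def projecting_angle(k, n, alpha0, alpha1):
    """Projecting angle of edge k (k = 1..2**n - 1) for n even-width patterns,
    with centreline angle alpha0 and half projecting-angle range alpha1."""
    half = 2 ** (n - 1)
    return alpha0 + math.atan((k - half) * math.tan(alpha1) / half)

# The centre edge k = 2^(n-1) lies on the projecting centreline, and the
# correction is symmetric about it.
a = projecting_angle(3, 3, 0.0, math.radians(15))
b = projecting_angle(5, 3, 0.0, math.radians(15))
print(round(a + b, 12))  # symmetric angles cancel: 0.0
```

Treating the edges as equally spaced in angle instead would overestimate the outer angles, which is the bending error the paper corrects.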
The encoding and decoding method based on the stripe edges of Gray code adopts the Gray code value
of an edge point in the former intensity images to establish its correspondence with the encoding
pattern. As shown in figure 1, the edges marked by broken lines are all situated in stripe interiors,
not at edges, in the former intensity images, so their code values are hard to misjudge. The method
therefore removes, in theory, the one-bit decoding error of pixel-centre-based Gray code.
When decoding by pixel centre, one pixel corresponds to many object sampling points, so its grey
value cannot be determined accurately; when decoding by stripe edge, the correspondence accuracy
between image sampling points and object sampling points can reach sub-pixel level.
3. Stripe edge sub-pixel location
Sub-pixel location technologies such as fitting, grey moment, resampling, spatial moment and
interpolation have been widely used in recent years [4]. This paper adopts the fitting method to
acquire the stripe edges in the intensity image. Because the stripes in the encoding pattern are
vertical, and those in the intensity image likewise, they are detected by horizontal scanning. The
intensity image is filtered by equation (3), and the grey value f(j) is replaced by g_4(j).

g_4(j) = f(j−2) + f(j−1) + f(j+1) + f(j+2)   (3)
Figure 3. Sub-pixel location by horizontal scanning: (a) intensity image; (b) grey values along the
scanned line, with local maximal and minimal values marked; (c) fitted curve after filtering.
Figure 4. Flow chart for reconstruction: Start; Input image; Filtering; Sub-pixel location by
horizontal scanning; Intensity image binarization; Detect Gray code, edge ordinal number and
projecting angle; Figure out sampling point space coordinates; Surface reconstruction; End.
Here j is the pixel ordinal number along the scanned row. Figure 3(a) shows an intensity image whose
line i is scanned; the grey value of each pixel is shown in figure 3(b). Fitting the curve of the
filtered grey values, as shown in figure 3(c), the average of the wave crest and wave hollow adjacent
to the edge is figured out. Finally, the corresponding horizontal position on the fitted curve is
determined as the sub-pixel edge position.
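The filtering and mid-level crossing steps above can be sketched for a single scan line. This is a simplified one-edge case with illustrative helper names; the paper fits a curve to the filtered values, whereas this sketch uses linear interpolation between the two samples bracketing the mid level:

```python
# Minimal sketch of sub-pixel edge location on one horizontal scan line,
# assuming a single dark-to-bright transition in the row.

def filter_row(f):
    # g4(j) = f(j-2) + f(j-1) + f(j+1) + f(j+2), equation (3)
    return [f[j - 2] + f[j - 1] + f[j + 1] + f[j + 2] for j in range(2, len(f) - 2)]

def subpixel_edge(f):
    g = filter_row(f)
    lo, hi = min(g), max(g)        # wave hollow and wave crest
    level = (lo + hi) / 2.0        # average of crest and hollow
    # find where the filtered values cross the mid level, with linear
    # interpolation between the two bracketing samples
    for j in range(len(g) - 1):
        if (g[j] - level) * (g[j + 1] - level) <= 0 and g[j] != g[j + 1]:
            frac = (level - g[j]) / (g[j + 1] - g[j])
            return 2 + j + frac    # +2 restores the offset lost in filtering
    return None

row = [10, 10, 10, 10, 10, 200, 200, 200, 200, 200]
print(subpixel_edge(row))  # a fractional position near the 10 -> 200 step
```

The returned position is a float, i.e. the edge is located between pixels rather than snapped to a pixel centre, which is what gives the method its sub-pixel correspondence accuracy.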
4. Reconstruction experiments
The measurement system was composed of a surface-light-source projector, a camera and a measured
object, simulated in 3ds Max software [5]. The field angle of the black-and-white camera was 40°, and
the CCD resolution was 1024×1024; the projector projected continuous encoding patterns over a
projecting angle range of 30°. The system range was about 240×240 mm. Matlab software was used to
process the images, calculate the space point coordinates and reconstruct the measured surface. The
program flowchart is shown in figure 4.
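Once the projecting angle of a sampling point is known, the triangle method mentioned in the abstract recovers depth from that angle and the camera viewing angle. The paper does not give its exact system geometry, so the following is a textbook sketch with an assumed baseline b between projector and camera, both ray angles measured from the baseline:

```python
import math

# Textbook active-triangulation depth sketch (assumed geometry, not the
# paper's calibrated system): two rays from points a baseline b apart, at
# angles alpha (projector) and beta (camera) from the baseline, intersect at
# depth z = b * tan(alpha) * tan(beta) / (tan(alpha) + tan(beta)).

def triangulate_depth(b, alpha, beta):
    """Depth of the intersection of a projector ray at angle alpha and a
    camera ray at angle beta, both measured from a baseline of length b."""
    return b * math.tan(alpha) * math.tan(beta) / (math.tan(alpha) + math.tan(beta))

# symmetric 45-degree rays meet at half the baseline depth
print(round(triangulate_depth(2.0, math.pi / 4, math.pi / 4), 9))  # 1.0
```

This is why decoding errors in the projecting angle translate directly into depth errors: z depends on tan(alpha), so a one-bit misjudgement of k shifts alpha and bends the reconstructed surface.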
A flat with depth value z = 508.499 mm was reconstructed by pixel-centre-based Gray code; the
reconstruction result and the error surface are shown in figure 5(a). Those by stripe-edge-based Gray
code are shown in figure 5(b). Obviously, the depth values of object sampling points with small
projecting angles in the flat reconstructed by pixel-centre-based Gray code diminished (moved towards
the camera), while those reconstructed by stripe-edge-based Gray code without correction augmented
(moved away from the camera), because the irregular-division error of the projecting angle was not
corrected. The depth values in the flat reconstructed by stripe-edge-based Gray code with the
correction of equation (2) did not deflect.

Figure 5. Flat reconstruction experimental results: (a) by pixel centre; (b) by stripe edge.
Specific reconstruction errors are shown in table 1.

Table 1. Reconstruction experimental results.

Method                     Z_max (mm)   Z_min (mm)   Maximal error (mm)   Relative error   Variance
Gray code by pixel centre  509.770      507.103      1.271                0.25%            1.408
Gray code by stripe edge   508.753      508.345      0.254                0.05%            0.035

According to table 1, the measurement accuracy of stripe-edge-based Gray code is higher than that of
pixel-centre-based Gray code.
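The relative-error column of table 1 is consistent with the maximal error divided by the nominal depth z = 508.499 mm:

```python
# Check of the relative-error column of table 1 (values taken from the table)
z = 508.499  # nominal flat depth in mm
for method, e_max in [("pixel centre", 1.271), ("stripe edge", 0.254)]:
    rel = 100.0 * e_max / z
    print(method, round(rel, 2), "%")  # 0.25 % and 0.05 %
```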
A complex 3D model was also reconstructed by the method and system in this paper. A Venus plaster
model and its reconstruction results are shown in figure 6: (a) is the measured model, and (b), (c)
and (d) are reconstruction results from multiple viewing angles, which reflect the measured surface
truly.

Figure 6. Reconstruction experimental results of the Venus plaster model: (a) measured model;
(b)–(d) multi-angle reconstructions.
5. Conclusion
A method for encoding and decoding Gray code based on stripe edges has been presented. The method
adopts key technologies such as decoding by Gray code stripe edges and acquiring the stripe edges by
horizontal-scanning sub-pixel location, so the one-bit decoding error of Gray code can be removed
while, at the same time, the advantage of high adaptability to steep parts of the measured surface is
retained. With object sampling points and image sampling points corresponding point by point, the
quantization error caused by pixel-centre decoding can be removed; the error of the reconstructed
surface bending at both sides of the measured object, caused by dividing the projecting angle
irregularly, is corrected. The simulation reconstruction experiments prove the method effective.
Future plans include acquiring higher sampling density by reducing the encoding stripe width, and
reducing the influence of occlusion on measurement by multi-angle measuring and joining.
Acknowledgements
The support of National Natural Science Foundation of China under research grant 60572030,
Specialized Research Fund for the Doctoral Program of Higher Education under research grant
20050214006, Heilongjiang Province Education Department Overseas Scholars Science Research
Foundation under research grant 1055HZ027, The Key Science and Technology Research Project of
Harbin under research grant 2005AA1CG152 and Heilongjiang Province Graduate Student Innovation
Research Foundation under research grant YJSCX2005-238HLJ is gratefully acknowledged.
References
[1] Salvi J 2004 Pattern codification strategies in structured light systems Pattern Recognition 37
827–849
[2] Gühring J 2001 Dense 3-D surface acquisition by structured light using off-the-shelf components
Proc. SPIE 4309 220–231
[3] Lee S 2004 An active 3D robot camera for home environment Proc. IEEE Sensors 1 477–480
[4] Nelson L 2004 Creating interactive 3-D media with projector-camera systems Proc. SPIE 5308
850–861
[5] Zhang G 2003 Elliptical-center locating of light stripe and its simulation for structured light
based 3D vision inspection Chinese Journal of Scientific Instrument 24 589–593