Article
Optical 3-D Profilometry for Measuring
Semiconductor Wafer Surfaces with Extremely
Variant Reflectivities
Liang-Chia Chen 1,2, * , Duc-Hieu Duong 2 and Chin-Sheng Chen 2
1 Department of Mechanical Engineering, National Taiwan University, No. 1, Section 4, Roosevelt Rd, Da’an
District, Taipei City 10617, Taiwan
2 Graduate Institute of Automation Technology, College of Mechanical and Electrical Engineering, National
Taipei University of Technology, No. 1, Section 3, Zhong-Xiao E. Rd, Da’an District, Taipei City 10608,
Taiwan; duongduchieu85@gmail.com (D.-H.D.); saint@mail.ntut.edu.tw (C.-S.C.)
* Correspondence: lchen@ntu.edu.tw; Tel.: +886-2-23620032

Received: 2 April 2019; Accepted: 14 May 2019; Published: 19 May 2019 

Abstract: A new surface profilometry technique is proposed for profiling a wafer surface with both diffuse and specular reflective properties. Most moiré projection scanning techniques using the triangulation principle work effectively on diffuse reflective surfaces, on which the reflected light beams are assumed to be well captured by optical sensors. In reality, this assumption is no longer valid when measuring a semiconductor wafer surface having both diffuse and specular reflectivities. To resolve the above problem, the proposed technique uses a dual optical sensing configuration that engages two optical sensors at two different viewing angles, one acquiring diffuse reflective light and the other simultaneously detecting specular surface light, to achieve simultaneous full-field surface profilometry. The deformed fringes measured by both sensors can be further transformed into a 3-D profile and merged seamlessly for full-field surface reconstruction. Several calibration targets and industrial parts were measured to evaluate the feasibility and accuracy of the developed technique. Experimental results showed that the technique can effectively detect diffuse and specular light with a repeatability of one standard deviation below 0.3 µm on a specular surface and 2.0 µm on a diffuse wafer surface when the vertical measuring range reaches 1.0 mm. These findings indicate that the proposed technique is effective for 3-D microscale surface profilometry in in-situ semiconductor automated optical inspection (AOI).

Keywords: semiconductor; surface profilometry; moiré projection; 3-D measurement; automated optical inspection (AOI)

1. Introduction
Measuring the 3-D profile of an object surface has become increasingly important in industrial automation, particularly for in-situ product inspection. 3-D surface measurement can be generally classified into two major categories: tactile and non-contact techniques. The coordinate measuring machine (CMM) is a general-purpose system that uses both contact and non-contact techniques to measure 3-D profiles. Tactile sensing cannot accurately measure soft objects or thin-film surfaces, as the probe may scratch or deform the measured surface. In contrast, non-contact techniques can effectively measure surfaces with high spatial resolution and are useful for measuring soft or flexible objects. For non-contact measurement, optical triangulation techniques, such as stereo vision, laser scanning, and structured light techniques, are widely used in industry. However, optical techniques often require uniform surface light reflection to ensure reliable sensing. The absence of such a condition poses difficulty for using optical triangulation techniques to measure a surface with both diffuse and specular reflectivities. One of the significant challenges in optical surface measurement lies in dealing with reflectivity variations of the object surfaces to be tested, such as a semiconductor wafer surface engraved or marked by laser etching. Measuring objects with high reflectivity variations, such as highly shiny or scattering surfaces, is critical to ensuring measurement accuracy and product quality assurance under automated optical inspection (AOI). Poor 3-D profiling results are common when high reflectance variances are encountered on the wafer surface to be tested or reconstructed for 3-D printing or other purposes.
One of the popular 3-D optical surface measurement techniques is structured light projection, which is widely employed in industrial testing applications. These techniques basically work according to the triangulation measurement principle, with the assumption of uniform light reflectivity on the tested object to ensure capture of the deformed fringe reflecting from the tested surface. However, in reality, such an assumption does not hold in the face of high reflectivity variations on the surface. For example, most industrial IC chips are fabricated on a silicon wafer substrate with an extremely shiny surface and are also embedded with metal bumps having a scattering surface reflectance. Under such conditions, the light reflected from the tested surface is rather complex and emitted in different directions. Consequently, the detecting sensor cannot receive accurate phase information for reconstructing the surface morphology.
Structured fringe projection is a full-field optical technique for measuring 3-D object surfaces. Owing to its advantages of fast speed, high accuracy, non-destructiveness, and full-field testing, fringe projection profilometry (FPP) has become one of the most promising 3-D surface profiling techniques in 3-D AOI. For obtaining the object shape, the common 3-D measurement techniques mainly include phase shifting profilometry (PSP) [1] and Fourier transform profilometry (FTP) [2]. In general, these techniques project structured light patterns, also called fringes, onto the object surface. Owing to the optical path difference (OPD) between inspecting beams, the fringe is deformed according to the 3-D shape of the object surface. To reconstruct the 3-D shape of the object surface, a phase map is computed using a wrapping algorithm, and the object height can be extracted using a phase unwrapping algorithm [3] and phase-to-height transformation. PSP is widely employed in industrial inspection because of its promising accuracy and high spatial resolution [4]. It uses a time-resolved wrapping strategy to extract phase information for reconstruction of object profiles. Different from PSP, FTP records distorted fringe images and analyzes object height information using phase modulation and demodulation to acquire a phase map from the distorted fringes. The unique advantage of FTP is its one-shot capability for 3-D surface imaging. However, for exact 3-D profile reconstruction, one of the key issues in FTP is to accurately extract the first-order spectrum from the spectral domain. The extraction requires an adequate band-pass filter to obtain accurate spectral information from the first-order spectrum and separate it from the background (zero-order) spectrum. Conditions for separating these spectrum regions have been investigated by considering the minimum distance between spectrum regions [5–7]. The single elliptic band-pass filter [8] and the double orthogonal elliptic band-pass filter [9,10] have been developed to accurately extract the first-order spectrum from the spectral domain. Nevertheless, FTP still lacks a robust algorithm for detecting the first-order spectrum to accurately reconstruct 3-D profiles. Apart from these methods, scanning white light interferometry (SWLI) is powerful for its phase unambiguity and nano-scale depth resolution [11]. It has been widely applied for precise sub-micro-scale and nano-scale surface profilometry, especially in semiconductor microstructure metrology. The method is mainly used for review or off-line metrology since its scanning efficiency is limited by the required vertical scanning.
A major challenge for 3-D scanning techniques using FPP is the presence of specular reflectance on the tested object surface. Specular highlights occur on object surfaces when the specular component of reflection is dominant. A specular highlight can easily saturate the detected image and seriously deteriorate image quality. In the literature, several techniques have been proposed to resolve this problem when measuring specular targets. All these techniques can reduce specular effects to different extents but cannot resolve the problem completely. One of the proposed techniques was
to separate the specular components from the diffuse components and remove the specular components using polarizers [12–15]. Tan and Ikeuchi relied on image color properties by transferring image color to the hue domain [16]. This technique effectively eliminated image-saturated regions, mainly because the saturation value of every pixel was made constant to achieve maximum chromaticity while keeping its hue. Other techniques, such as using multiple exposure times [17] or projecting special structured light with strip edges [18], have been explored for handling non-uniform light reflection. Phase measuring deflectometry (PMD) is a specular surface measuring technique using a Thin-Film-Transistor Liquid-Crystal Display (TFT LCD) screen as the pattern-projecting light source [19,20]. The maximum detectable surface curvature is currently limited by the LCD light dispersion angle. More recently, single-shot deflectometry using a single composite pattern was developed for one-shot three-dimensional imaging using Fourier transform (FT) and spatial carrier-frequency phase-shifting (SCPS) techniques [21]. However, for PMD in general, owing to the demanding computation in slope integration, absolute shape measurement may not be accurately achieved. Thus, apart from visible moiré fringe projection, Wiedenmann et al. projected a high-power infrared laser pattern onto the measured object surface, in which the projected laser power was partly transformed into dissipative heat [22]. Since heat dispersion can be generated from a highly specular surface, extremely dark or transparent materials can be observed clearly by a proper heat sensor [23]. A phase shifting technique can then be employed to measure the 3-D profile according to the distortion of the heat pattern on the object surface. The current limitation of this method is its low measuring accuracy, which still requires further improvement. To date, several light separation strategies using light property analysis have been developed, including separating specular reflection using color analysis [3], examining the chromaticity and intensity values of specular and diffuse pixels [16], employing polarization to separate reflection components [24], or using environmental structured illumination [25]. Although these methods have been effective to some extent in separating diffuse from specular light, they basically assume either uniform surface reflectance or specific surface light reflectance conditions, which may not be applicable for many in-situ testing applications.
Many semiconductor parts with variant reflectivities need to be inspected in in-situ manufacturing processes for strict dimensional compliance. These parts often comprise a body (such as epoxy or silicon), metal bumps, and pads (such as lead and solder), each of which may have a complicated geometric shape with a different orientation with respect to the detecting optical sensor [26]. Moreover, the most challenging task is to overcome extremely variant wafer surface reflectivities, ranging widely from diffuse to specular conditions. In 3-D inspection, surface profilometric techniques project a structured light pattern onto the tested object under the desirable assumption that the surface under test (SUT) is non-absorbent and perfectly diffusely scattering. However, in reality, this assumption may not hold in many actual cases. In optical triangulation detection, viewing the measured surface from a sensing angle can introduce a large measuring uncertainty in detecting the reflected light because of variant surface reflectance properties. The amount of reflected light detected by the imaging sensors can significantly affect the success of the surface measurement. Most existing techniques discussed above can resolve the specular light problem only for a surface with uniform reflectivity. For objects with non-uniform surface reflectivity, the captured image contrast becomes poor or the image even becomes non-detectable.
Therefore, there is a need to develop an effective technique that can overcome the complex reflectivity problem encountered in optical surface profilometry, especially for in-situ semiconductor wafer surface inspection. This article proposes a dual sensing strategy to overcome the 3-D surface profilometric problem encountered on semiconductor wafer surfaces. The rest of this paper is organized as follows. Section 2 presents the proposed method for 3-D profilometry using dual sensing. The experimental design and measured results are shown and analyzed in Section 3. Finally, the conclusions are summarized in Section 4.
2. Developed 3-D Profilometry Using Dual Sensing

In structured fringe projection, the angle between the projection and the sensing detection plays an important role in determining the reflecting direction. Figure 1a shows the fringe image captured from a semiconductor wafer surface with a laser-etched microscale groove, detected with the viewing angle set at zero. As can be seen, the groove area is rather clear, but the remaining specular wafer area is almost non-detectable. The groove area can be seen with an adequate exposure time because diffuse light is generally weak. With a larger viewing angle, the wafer surface becomes clearer with a better image contrast (see Figure 1b,c). The sequence shown in Figure 1b–d represents a transition in imaging with increasing viewing angle. When the viewing geometry satisfies the law of reflection, the fringe contrast on the wafer surface reaches its maximum, as illustrated in Figure 1e. Figure 1f represents the case in which the sensor becomes oversaturated owing to excessive exposure time. Figure 1 thus illustrates the critical reflectance condition of the tested surface under inspection. Figure 2 shows the general reflection model of diffuse and specular components when the light is incident from an angle.



Figure 1. Images of wafer surface etched with a laser-etched groove and captured at different viewing angles: (a) viewing angle 0 ± 1°, exposure time 420 ms; (b) viewing angle 12 ± 1°, exposure time 200 ms; (c) viewing angle 25 ± 1°, exposure time 80 ms; (d) viewing angle 45 ± 1°, exposure time 20 ms; and (e) viewing angle 45 ± 1°, exposure time 50 ms.

Figure 2. Reflection model of diffuse and specular components.


To resolve the above issue, this article presents a new optical measuring configuration to capture both specular and diffuse light reflecting from wafer surfaces. A dual sensing strategy is proposed to detect the light reflection from wafer surfaces when structured fringe projection is used to measure and reconstruct 3-D surfaces. The proposed system design and measuring principle are detailed in the following sections.

2.1. Optical System Setup

Figure 3 illustrates the optical system configuration of the developed 3-D profilometry. As can be seen, it includes a low-coherent light source (1), a diffuser (2), which is collimated by a lens module (3), a sinusoidal grating (4), and a linear translator (5) for generating an accurate shifted phase. To achieve full-field surface profilometry using PSP, the light source is a low-coherent source such as an LED (light-emitting diode), instead of a coherent one such as a laser. In the system, the structured light generated by the incident optical module passes through a telecentric lens (6) and is projected onto the measured surface (8) defined by the reference plane (7). The reflected light passes through two objectives (12 and 13) and is captured by two individual optical sensing cameras (10 and 11). The first objective (12) is arranged in the vertical direction to capture the diffuse light, while the second objective is set at a reflection angle with respect to the incident projection light angle so that specular light can be well captured. With such an optical configuration, both the diffuse and specular light beams reflected can be well detected by their corresponding objectives. When the projected sinusoidal fringe (9) hits the measured wafer surface, it becomes a deformed fringe according to the phase difference generated by the optical path difference (OPD) between the object profile and the reference plane. The deformed fringe image is transferred to a computer (14), and the phase map is then computed with multi-phase shifting and the wrapping principle using a set of deformed fringe images. Subsequently, phase unwrapping and phase-to-height transformation are performed to extract the profile of the measured surface.

Figure 3. Configuration of the developed dual-sensing measurement system.

All the optical lenses employed in the developed system are telecentric to ensure the formation of a constant image size along the depth of field (DOF) of the measurement. When the measured surface is located inside the overlapped region of the DOF of the two cameras (indicated as the gray zone in Figure 3), the captured image size can be kept constant, so the FOV remains a rectangular or square shape according to the image sensor's design. Meanwhile, to keep the focus plane of the imaging sensor located at the same focus plane as the projecting light side, the image sensor can be tilted with respect to the principal optical axis by an adequate angle to compensate for the tilting effect. Thus, both focus planes (light projection and imaging) are kept at the same focus plane, and the image shape is rectangular or square in the same FOV. Meanwhile, an adequate camera calibration can be performed to identify all the intrinsic and extrinsic parameters of the coordinate transform, so the 3D point cloud can be transformed into image coordinates accurately.
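The paper does not name the calibration toolchain. As a point of reference only, below is a minimal sketch of identifying intrinsic and extrinsic parameters with OpenCV's standard checkerboard routine; the board geometry, square size, and image folder are hypothetical, and note that telecentric optics are often better served by an affine camera model than by the pinhole model assumed here.

```python
# Minimal intrinsic/extrinsic calibration sketch using OpenCV's standard
# checkerboard routine (an assumption; the paper does not name a toolchain).
import glob
import cv2
import numpy as np

BOARD = (9, 6)      # hypothetical inner-corner count of the checkerboard
SQUARE_MM = 0.5     # hypothetical square size in mm

# 3-D object points of the board corners in the board plane (z = 0).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):              # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# K is the 3x3 intrinsic matrix; rvecs/tvecs are the per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.4f} px")
```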

2.2. Measuring Principle

2.2.1. Fundamentals in PSP


Using the PSP technique for 3-D shape measurement involves several procedures: projecting sinusoidal fringe patterns onto the object surface, taking images of the distorted pattern reflected from the object surface, and then calculating the phase map from these captured images. Height information is extracted from the phase map by phase unwrapping and phase-to-height transformation. Calibration coefficients are then employed to transform the 3-D coordinates in image coordinates into world coordinates (x, y, z) for each measured point. The phase $\phi(x, y)$ can be computed using the light intensities of a set of captured images under phase-shifted fringe projection. The five-step phase-shifting technique is commonly used because of its accuracy and efficiency. The intensities of the five channels of each pattern can be modeled as follows:

$$I_i(x, y) = I_b(x, y) + I_m(x, y)\cos\left[\phi(x, y) + \delta_i\right], \quad i = 1, \ldots, 5 \tag{1}$$

where $\phi(x, y)$ is the phase value used by the phase-shifting algorithm at location (x, y), $\delta_i$ is the phase shift, $I_i(x, y)$ denotes the intensity of the i-th captured image at each pixel position, and $I_m(x, y)$ and $I_b(x, y)$ are the intensity modulation and the average intensity for each pixel, respectively.
The phase value of each pixel, $\phi(x, y)$, can be resolved using Equation (1) as:

$$\phi(x, y) = -\tan^{-1}\left[\frac{\sum_{i=1}^{N} I_i(x, y)\sin(\delta_i)}{\sum_{i=1}^{N} I_i(x, y)\cos(\delta_i)}\right] \tag{2}$$

where $\delta_i$, $i = 1, 2, \ldots, N$, are the phase-shift values of each channel.


For the five-step phase-shifting technique ($N = 5$) with $\delta_1 = 0°$, $\delta_2 = 90°$, $\delta_3 = 180°$, $\delta_4 = 270°$, and $\delta_5 = 360°$, the phase value $\phi(x, y)$ can be computed from the intensity of each channel and the corresponding phase shift $\delta_i$ using Equation (2).
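As a concrete illustration of Equations (1) and (2), the following minimal NumPy sketch computes the wrapped phase from a set of phase-shifted fringe images; the placeholder frames stand in for the captured images.

```python
import numpy as np

def wrapped_phase(images, deltas):
    """Wrapped phase via Equation (2) for an N-step phase shift.

    images: list of N phase-shifted fringe images (2-D float arrays).
    deltas: the N phase shifts delta_i, in radians.
    """
    num = sum(I * np.sin(d) for I, d in zip(images, deltas))
    den = sum(I * np.cos(d) for I, d in zip(images, deltas))
    # arctan2 returns values in (-pi, pi], i.e. the wrapped phase map.
    return -np.arctan2(num, den)

# Five-step shifts used in the paper: 0, 90, 180, 270, 360 degrees.
deltas = np.deg2rad([0.0, 90.0, 180.0, 270.0, 360.0])
frames = [np.random.rand(480, 640) for _ in range(5)]  # placeholder images
phi = wrapped_phase(frames, deltas)
```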
With the phase value $\phi(x, y)$ being a function of the arctangent, the phase can only be obtained in a range from $-\pi$ to $\pi$; it is therefore wrapped. Consequently, if the pixel-based phase difference is larger than $2\pi$, the wrapped phase becomes discontinuous and ambiguous. To obtain a continuous phase map, a phase unwrapping process that subtracts or adds multiples of $2\pi$ is needed to reconstruct the non-ambiguous phase map. The Goldstein phase unwrapping algorithm was employed to extract the unwrapped phase map [3]. Then, the unwrapped phase difference can be transferred to a height map using the following phase-to-height transformation.
The phase-to-height transformation is performed using the triangulation technique. The height information of the object surface $h(x, y)$ at each pixel is carried by its phase difference value $\Delta\phi(x, y)$, which is measured as the difference between the measured phase on the object surface and that on the reference plane. Since the measured height is generally much smaller than the projection height of the optical system and the projecting objective is telecentric, the object height has a linear relation with the unwrapped phase difference, as shown in Equation (3). Thus, the phase value can be transformed by a linear transformation coefficient $K$, and the height information of the object is given as:

$$h(x, y) = K\,\Delta\phi(x, y) \tag{3}$$
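Continuing the sketch, the wrapped map can be unwrapped and scaled by the coefficient $K$ of Equation (3). The paper employs the Goldstein algorithm [3]; the scikit-image unwrapper below is only a stand-in, and the coefficient value is a placeholder for the calibrated one.

```python
import numpy as np
from skimage.restoration import unwrap_phase

def height_map(phi_obj, phi_ref, K):
    """Height via Equation (3): h = K * (unwrapped phase difference)."""
    # Stand-in unwrapper; the paper uses the Goldstein branch-cut method [3].
    dphi = unwrap_phase(phi_obj) - unwrap_phase(phi_ref)
    return K * dphi

# K comes from calibration against a target of known height; the value below
# is a made-up placeholder (units: mm per radian).
# h = height_map(phi_obj, phi_ref, K=1.5e-2)
```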

2.2.2. Dual-Sensing Technique


As mentioned above, when the projected light beam hits the tested wafer surface, the reflected light may contain both diffuse and specular components (see Figure 2). The diffuse components are mainly formed by light reflected from microstructures on wafer regions with high surface roughness. Individual reflected beams could in principle interfere with one another; however, no light interference occurs owing to the low coherence of the light source. In the optical design, the diffuse light is reflected over a wide angle and can be detected by the vertical sensing camera. On the other hand, specular light is generated by reflection of the incident light from a shiny, flat wafer surface having very low surface roughness. According to the law of reflection, specular light can be detected by an optical objective located at a reflection angle equal to that of the incident light.
To capture both diffuse and specular components simultaneously, a dual-sensing optical system is developed. In the configuration, one camera (10) is dedicated to capturing the diffuse components, while another camera (11) is designated for capturing the specular components. To reconstruct the 3-D shape of the measured surface, the reflected fringe images representing both diffuse and specular beams are simultaneously captured using the proposed system illustrated in Figure 3. By controlling the adequate duration of light exposure required by the individual cameras, two deformed fringe images with high contrast can be detected. The entire 3-D shape of the measured target surface can then be effectively synthesized from the two individually detected images and reconstructed by accurate pixel-wise spatial mapping between the two sensors. Details of the proposed measuring system are illustrated by the
flowchart shown in Figure 4.

Figure 4. Flowchart of the proposed measuring technique.

In the proposed technique, the 3-D shape region detected and established from diffuse light can be well synthesized with the 3-D shape region detected and established from specular light. The two detected surface regions can be further merged seamlessly using a camera calibration procedure. A special artifact target is designed and employed as the mapping target. The two cameras can be calibrated and merged into the same measuring field of view (FOV) on the tested surface. Details of the proposed calibration procedure are illustrated by the flowchart shown in Figure 5.

Figure 5. Flowchart of the proposed calibration procedure.

2.2.3. Dual Camera Mapping

To achieve precise mapping and calibration, an automatic and accurate mapping algorithm was developed to align the two cameras onto the same FOV on the measured object. Figure 6 illustrates the proposed idea of dual camera mapping using random speckles. When mapping images captured by the two cameras, a unique speckle pattern is used and a cross-correlation algorithm is employed to compute the corresponding coefficient. In this study, the following normalized cross-correlation (NCC) algorithm is employed to compute the correlation between the images captured by the two cameras:

$$R(x, y) = \frac{\sum_{x', y'} T'(x', y')\, I'(x + x', y + y')}{\sqrt{\sum_{x', y'} T'(x', y')^2 \cdot \sum_{x', y'} I'(x + x', y + y')^2}} \tag{4}$$

where $T'$ and $I'$ are the mean-subtracted template and sub-image,

$$I'(x + x', y + y') = I(x + x', y + y') - \frac{1}{w \cdot h} \sum_{x'', y''} I(x + x'', y + y''),$$

with $w$ and $h$ denoting the template width and height, and $T'$ defined analogously from $T$.
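For a single placement of the template, Equation (4) reduces to the zero-mean normalized cross-correlation sketched below (a NumPy transcription, not the authors' code). Evaluated over all offsets, the same score is what OpenCV's cv2.matchTemplate computes with the cv2.TM_CCOEFF_NORMED flag.

```python
import numpy as np

def zncc(template, window):
    """Zero-mean NCC score of Equation (4) for one template placement.

    template, window: equal-shaped 2-D float arrays (T and the sub-image of I).
    Returns a score in [-1, 1]; 1.0 indicates a perfect match.
    """
    t = template - template.mean()          # T' in Equation (4)
    w = window - window.mean()              # I' in Equation (4)
    denom = np.sqrt((t ** 2).sum() * (w ** 2).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0
```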

Figure 6. Proposed idea of dual camera mapping using random speckles.

The correlation value ranges from 0.0 to 1.0. A matching score of 1.0 means 100% matching between the sub-image and the template image. To ensure that the dual camera mapping technique works well, a correlation threshold is needed to assess the matching score. It is also important to note that the sub-image size must be set large enough to retain the accuracy of the matching process. When the sub-image size is too small, the robustness of the correlation technique cannot be assured. In practice, the sub-image size is chosen as a quarter of the captured FOV to achieve good matching and reasonable computation efficiency. To achieve precise image mapping between the two images, a random speckle pattern can be prepared and employed by projecting uniform blue light on an electrostatically coated aluminum surface with micro random patterns on it. The unique and random speckle pattern can provide accurate image registration for image mapping. Figure 7 shows the designed aluminum surface target and one of the speckle images prepared for image registration.

Figure 7. Calibration target design for dual camera image mapping: (a) white painted aluminum target; (b) captured speckle image using the vertical camera.

Figure 8 shows the flowchart of the proposed dual camera mapping technique. One sub-image (a quarter of the captured image) of the vertical camera is cropped at the image center and selected as the template image. The matching score between the tilted image and the template image (vertical camera) is determined using the NCC algorithm. The shift between the template image and the image captured by the tilted camera can then be calculated along the X and Y directions of the image coordinates. This parameter accurately represents the spatial location difference between the FOVs of the tilted and vertical cameras. The process is repeated until the parameter converges to a preset threshold for completion.

Figure 8. Flowchart of the proposed dual camera mapping procedure.
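The mapping loop of Figure 8 might be sketched as follows, assuming OpenCV for the template search; the quarter-FOV template follows the sizing rule stated above, while the convergence threshold is illustrative.

```python
import cv2
import numpy as np

def fov_shift(vertical_img, tilted_img):
    """Estimate the X/Y shift between the two cameras' FOVs (Figure 8).

    A quarter-of-FOV template is cropped at the center of the vertical-camera
    image and searched in the tilted-camera image by zero-mean NCC.
    """
    h, w = vertical_img.shape
    th, tw = h // 2, w // 2                  # a quarter of the FOV by area
    y0, x0 = (h - th) // 2, (w - tw) // 2
    template = vertical_img[y0:y0 + th, x0:x0 + tw].astype(np.float32)

    scores = cv2.matchTemplate(tilted_img.astype(np.float32), template,
                               cv2.TM_CCOEFF_NORMED)
    _, best_score, _, (bx, by) = cv2.minMaxLoc(scores)
    # Offset of the tilted-camera FOV relative to the vertical one, in pixels.
    return (bx - x0, by - y0), best_score

# In the full procedure this estimate is re-evaluated after each adjustment
# until the residual shift converges below a preset threshold, e.g. 0.5 px.
```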

2.2.4. Segmentation of Specular and Diffuse Images

An image averaging algorithm is developed to extract specular and diffuse measuring regions with a high S/N ratio from the acquired deformed fringe images. Removal of the projected fringe from the acquired images is essential for segmentation because the deformed fringes contain light intensity modulation, thus making segmentation difficult. In phase shifting, several phase-shifted sinusoidal patterns are projected onto the tested object and their corresponding deformed fringe images are acquired sequentially. A background image is defined here as an illuminated object image without structured fringes. To obtain the background image from these deformed fringe images, the image averaging algorithm can be performed by simply summing all the phase-shifted images to remove the fringe modulation and generate the background image.
It is important to note that the above algorithm works reliably even when the reflectivity of the tested surface is extremely variant. The algorithm operates on every individual pixel for the summing; hence, the result is generally insensitive to the pixel intensity. In an experimental test of the algorithm, the sinusoidal projection grating was made by a precise lithography process, and a semiconductor wafer surface having laser-etched grooves was tested. A precise linear stage with positioning repeatability of less than 1 µm was adopted to perform four-step phase shifting. Figure 9a–d illustrates the captured images from the four-step fringe projection with phase shifts of 0°, 90°, 180°, and 270°, respectively. As can be seen, all the projected fringes are effectively removed by the image averaging algorithm, and the illuminated object image (also called the background image) is extracted. The experimental result shown in Figure 9 was acquired by the tilted camera for specular light detection on a smooth wafer surface. An excellent image contrast between the specular and diffuse measured regions is achieved in the background image. The same can be performed on the diffuse images with equal effectiveness.
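A compact sketch of the averaging step, together with one plausible way to segment the resulting background image, is given below; the averaging mirrors the algorithm described above, whereas the Otsu threshold used to split bright (specular-dominant) from dark regions is an assumption made for illustration.

```python
import numpy as np
from skimage.filters import threshold_otsu

def background_and_masks(frames):
    """Fringe removal by averaging (Section 2.2.4) plus a simple segmentation.

    frames: the N phase-shifted fringe images from one camera.
    Returns the background image and boolean bright/dark region masks.
    """
    # Averaging equally spaced phase shifts cancels the sinusoidal fringe
    # modulation, leaving the illuminated object ("background") image.
    background = np.mean(frames, axis=0)

    # Bright regions reflect strongly toward this camera; Otsu's threshold
    # is an assumed, illustrative way to separate them from the remainder.
    t = threshold_otsu(background)
    bright = background > t
    return background, bright, ~bright
```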
Figure 9. Phase-shifted images acquired from the four-step phase shifting operation and the image summing algorithm: (a) phase-shift 0°; (b) phase-shift 90°; (c) phase-shift 180°; (d) phase-shift 270°; and (e) image resulting from averaging the four channels.

3. Experimental Design and Result Analyses

To demonstrate the feasibility and measurement performance of the proposed technique, a 3-D measurement system was developed consisting of two sensing cameras, a single-frequency transmission grating with a period of 24 line pairs per millimeter, and a step motor with a positioning accuracy of better than 1 µm. Two telecentric lenses with the same optical specification are integrated in front of the two cameras as the optical objectives to collimate the reflected diffuse and specular light from the measured object surface. The light module was powered by a 30-Watt white light LED, in which the LED is mounted on a specially designed mechanical radiator for good heat dissipation. The SolidWorks design and the hardware of the housing and optical system are shown in Figure 10a,b, respectively. The measuring FOV can reach 1.0 × 1.0 mm for every measurement. The maximum measuring depth of the system can reach 1.0 mm, which is sufficient for most microstructure profile inspection.

Figure 10. Developed system: (a) SolidWorks design; (b) system hardware.
Figure 11a,b shows the fringe images of the tested wafer surface captured using the tilted and vertical cameras, respectively. Figure 11c,e illustrates the phase map and one height cross-section of the specular wafer surface measured using the tilted camera. Similarly, the phase map and one height cross-section for the laser-etched groove surface regions are measured simultaneously using the vertical camera and shown in Figure 11d,f. The 3-D profile of the groove surface region can be well measured and reconstructed (see Figure 11h). It is important to note that the 3-D measured shape of the groove region contains random noise due to the non-detectable diffuse light from the laser-etched groove region. Nevertheless, this region can be fully recovered and reconstructed by the vertical camera, which captures the diffuse light from the groove region. The two detected 3-D profiles, shown in Figure 11g,h, are extracted from their respective original images by the image summing algorithm and further mapped precisely by the camera calibration technique to reconstruct a full-field 3-D profile of the measured wafer, shown in Figure 11i. These results indicate that the full-field 3-D wafer surface can be measured and reconstructed accurately using the developed technique.

Figure 11. Measurement results of the tested wafer sample shown in Figure 1: (a) captured deformed fringe image of specular light; (b) captured deformed fringe image of diffuse light; (c) wrapped phase map of (a); (d) wrapped phase map of (b); (e) reconstructed cross-section of (c); (f) reconstructed cross-section of (d); (g) reconstructed 3-D map of the smooth wafer surface; (h) reconstructed 3-D map of the laser-etched groove surface; and (i) reconstructed 3-D shape of the whole tested wafer surface.

Figure 12 exhibits the measured height cross-section along one lateral section on the tested wafer captured by the tilted and vertical cameras. The standard deviations of the wafer surface region and the groove area are evaluated for measurement precision. In a 30-time repeatability test, the standard deviations in the measured data regions shown in Figure 12a,b are 0.265 µm and 1.937 µm, respectively.

Figure 12. Height cross-section of a surface cross-section on the wafer in Figure 1: (a) reconstructed cross-section profile obtained by capturing specular light; (b) reconstructed cross-section profile obtained by capturing diffuse light; and (c) mapped cross-section profile.

The results obtained show that a measurement repeatability of better than 2.0 µm can be achieved when measuring a rough laser-machined wafer surface. With the proposed technique, the measured average step height of the cutting groove on the wafer is 19.01 µm.

To check the measurement accuracy, the measured result was compared with that of a pre-calibrated laser confocal microscope. Figure 13 illustrates the measurement result obtained using a laser confocal microscope (KEYENCE VK-X1000) on the same measured wafer surface region. The average step height of the laser-etched groove measured by the confocal microscope over 30 measurements was 18.853 µm. With the developed method, the measured average step height on the same wafer surface was 20.21 µm, giving a measurement deviation of 1.357 µm in this case.

Figure 13. Measurement results of the wafer obtained using a laser confocal microscope (KEYENCE VK-X1000): (a) 2D captured image of the wafer; (b) 3-D measured profile of the wafer; and (c) one cross-section of the 3-D profile.

To further check the feasibility of the developed method on industrial semiconductor parts with variant surface reflectivity, an IC chip was selected for measuring its surface profile. Figure 14 shows the measured results on an industrial IC surface with extreme reflectivity variance between the substrate and the metal pads. The substrate image detected by specular light is shown in Figure 14a, while the diffuse image detected by diffuse light is illustrated in Figure 14b. Neither of these two images alone can cover the entire surface. With the developed method, both specular and diffuse images can be detected simultaneously and reconstructed accurately. The phase map and one surface cross-section profile can be reconstructed for smooth substrate surface regions, as seen in Figure 14c,e, respectively. Similarly, Figure 14d,f represent, respectively, the phase map and one surface cross-section profile of the metal-pad regions. Compared with the developed method, the traditional fringe projection method produces excessive measurement noise around the IC chip's pads, which are mainly diffusely reflective areas, as seen in Figure 14g. By using the developed technique, the surface measurement of the IC chip can be successfully obtained, in which both the pad and substrate areas are reconstructed simultaneously and accurately, as shown in Figure 14j. In other words, a significant improvement in the surface profilometry of objects with substantial variance of surface reflectance can be achieved with the developed approach.
Figure 14. Measurement results of industrial IC chip: (a) image captured by camera receiving specular
light; (b) image captured by vertical camera; (c) wrapped phase map of tilted camera; (d) wrapped
phase map of vertical camera; (e) cross-section on wrapped phase map of tilted camera; (f) cross-section
on wrapped phase map of vertical camera; (g) 3-D shape of IC chip measured by tilted camera; (h) 3-D
shape of IC chip measured by vertical camera; (i) measured ROI of industrial IC chip; and (j) 3-D
shape of measured ROI in (i).
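To make the dual-sensing idea concrete, the sketch below computes a per-pixel fringe modulation and a wrapped phase map for a standard four-step phase-shifting sequence; the four-step model, the synthetic fringe frames, and the 0.1 modulation threshold are illustrative assumptions and may differ from the demodulation actually employed here. Each camera would apply the same validity test to its own captures, so the tilted (specular-path) camera retains the substrate pixels while the vertical camera retains the pad pixels:

```python
import numpy as np

def modulation(i1, i2, i3, i4):
    """Fringe modulation gamma = B/A for a four-step sequence
    I_k = A + B*cos(phi + k*pi/2), k = 0..3 (an assumed model)."""
    num = 2.0 * np.sqrt((i4 - i2) ** 2 + (i1 - i3) ** 2)   # equals 4B
    den = i1 + i2 + i3 + i4 + 1e-9                         # equals 4A, guarded
    return num / den

# Synthetic stand-in for one camera's four phase-shifted captures.
phi = np.tile(np.linspace(0.0, 8.0 * np.pi, 640), (480, 1))
A, B = 120.0, 80.0                                  # background / fringe amplitude
frames = [A + B * np.cos(phi + k * np.pi / 2.0) for k in range(4)]

gamma = modulation(*frames)
valid = gamma >= 0.1                                # keep only well-modulated pixels
wrapped = np.arctan2(frames[3] - frames[1],         # wrapped phase, cf. Figure 14c,d
                     frames[0] - frames[2])
```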

Figure 15a,b illustrate the cross-sections of the height map at Column 400 on the IC chip imaged by the two cameras in charge of capturing specular and diffuse light, respectively. With the mapping algorithm, the merged cross-section profile can be obtained, as seen in Figure 15c. The measurement standard deviations of the specular and diffuse surfaces on the IC chip are 0.602 and 1.285 µm, respectively. With the proposed technique, the measured average step height of the metal pads on the IC chip is 13.2 µm.

Figure 15. Cross-section profile measured on IC chip shown in Figure 14: (a) cross-section profile reconstructed from specular light components; (b) cross-section profile reconstructed from diffuse light components; (c) 3-D cross-section surface mapped by proposed mapping algorithm.
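The merging step above can be pictured as a masked overwrite of two pre-registered height maps on a common pixel grid. The sketch below is a minimal stand-in for the mapping algorithm, in which the specular-first priority rule and the toy 13.2 µm pad values are assumptions made purely for illustration:

```python
import numpy as np

def merge_height_maps(h_spec, h_diff, m_spec, m_diff):
    """Overlay two height maps already registered to one pixel grid.
    Where both cameras are valid, the specular value is kept because of
    its lower noise; this priority rule is an assumption, not necessarily
    the exact mapping algorithm used in this study."""
    merged = np.full(h_spec.shape, np.nan)
    merged[m_diff] = h_diff[m_diff]       # fill diffusely imaged pads first
    merged[m_spec] = h_spec[m_spec]       # specular substrate takes priority
    return merged

# Toy example: 13.2 µm metal pads (diffuse) on a flat substrate (specular).
h_spec = np.zeros((4, 8))
h_diff = np.full((4, 8), 13.2)
m_spec = np.zeros((4, 8), dtype=bool)
m_spec[:, :4] = True                      # left half seen by the tilted camera
m_diff = ~m_spec                          # right half seen by the vertical camera
print(merge_height_maps(h_spec, h_diff, m_spec, m_diff))
```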

4. Conclusions

In this study, a new dual-sensing measuring technique for 3-D surface profilometry is developed and proved effective for measuring objects with extreme surface reflectance variance. The key element of the developed technique is its new optical configuration for simultaneous detection of both the diffuse and specular light reflected from various surface regions of a tested object. Moreover, an accurate image segmentation and mapping algorithm is developed to extract the effectively detected image regions and merge them seamlessly into an accurate 3-D surface map. Experimental results indicate that the proposed technique can successfully achieve simultaneous measurement with repeatability of one standard deviation below 0.3 µm on a specular surface and 2.0 µm on a diffuse surface over a total vertical measurable range of 1.0 mm. Specular light detection is more accurate than diffuse light detection in 3-D surface profilometry, which is mainly influenced by the SNR (signal-to-noise ratio) of the deformed fringe image. The developed technique can provide an effective wafer surface measuring method for in-situ AOI, especially in 3-D full-field surface profilometry.

Author Contributions: Conceptualization, L.-C.C.; Methodology, L.-C.C. and D.-H.D.; Software, L.-C.C. and D.-H.D.; Validation, L.-C.C. and D.-H.D.; Formal Analysis, L.-C.C. and D.-H.D.; Investigation, L.-C.C., D.-H.D. and C.-S.C.; Resources, L.-C.C.; Data Curation, L.-C.C. and D.-H.D.; Writing-Original Draft Preparation, L.-C.C. and D.-H.D.; Writing-Review & Editing, L.-C.C.; Visualization, L.-C.C.; Supervision, L.-C.C. and C.-S.C.; Project Administration, L.-C.C. and C.-S.C.; Funding Acquisition, L.-C.C.

Funding: This research was funded by the Ministry of Science and Technology, Taiwan, under grant nos. MOST 107-2218-E-002-060 and MOST 107-2218-E-002-002.

Conflicts of Interest: The authors declare no conflict of interest.


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
