
Article
A New Simplification Algorithm for Scattered Point Clouds
with Feature Preservation
Miao Gong, Zhijiang Zhang * and Dan Zeng

Key Laboratory of Specialty Fiber Optics and Optical Access Networks, Joint International Research Laboratory
of Specialty Fiber Optics and Advanced Communication, Shanghai Institute for Advanced Communication and
Data Science, Shanghai University, 99 Shangda Road, Shanghai 200444, China; gongmiao@shu.edu.cn (M.G.);
dzeng@shu.edu.cn (D.Z.)
* Correspondence: zjzhang@shu.edu.cn

Abstract: High-precision and high-density three-dimensional point cloud models usually contain
redundant data, which implies extra time and hardware costs in the subsequent data processing
stage. To analyze and extract data more effectively, the point cloud must be simplified before data
processing. Given that point cloud simplification must be sensitive to features to ensure that more
valid information can be saved, in this paper, a new simplification algorithm for scattered point
clouds with feature preservation, which can reduce the amount of data while retaining the features of
data, is proposed. First, the Delaunay neighborhood of the point cloud is constructed, and then the
edge points of the point cloud are extracted by the edge distribution characteristics of the point cloud.
Second, the moving least-square method is used to obtain the normal vector of the point cloud and
the valley ridge points of the model. Then, potential feature points are identified further and retained
on the basis of the discrete gradient idea. Finally, non-feature points are extracted. Experimental
 results show that our method can be applied to models with different curvatures and effectively
 avoid the hole phenomenon in the simplification process. To further improve the robustness and
anti-noise ability of the method, the neighborhood of the point cloud can be extended to multiple
levels, and a balance between simplification speed and accuracy needs to be found.

Keywords: point cloud simplification; feature extraction; Delaunay neighborhood; moving least square; discrete gradient

1. Introduction
Three-dimensional (3D) scanning technology has always been the focus of research on computer vision, reverse engineering, and computer graphics [1]. A point cloud is the most basic and popular data type collected through 3D scanning technology [2]. With the development of 3D scanning technology, the accuracy and density of 3D point cloud models have increased continuously. This study focuses on the scattered data point cloud, which is distributed irregularly and unordered. However, these 3D point cloud models contain huge amounts of redundant data. If all point cloud data are processed, the computer will inevitably consume a large amount of running time and occupy huge storage space, seriously affecting the efficiency of modeling and rendering. Therefore, on the premise of ensuring the features of the point cloud model, simplifying the point cloud in the early stage of point cloud processing is very important and practical.
Currently, the commonly used point cloud simplification methods include the bounding box algorithm [3,4], the uniform grid method [5], the curvature sampling method [6], the clustering method [7], the triangular grid method [8], etc. [9–12]. However, these methods have some shortcomings. To make the density of the point cloud more uniform, some methods ignore local details, which leads to inaccurate reconstruction results. Some methods, such as the curvature sampling method, can retain the original details of the point cloud well, but they are computationally intensive and inefficient. Based on this,


many scholars have studied the simplification of the point cloud and proposed several
algorithms. For a structured point cloud, Markovic et al. [13] proposed a simplified point
cloud method that uses ε-insensitive support vector regression with spline and b-spline
kernels to find the significant points from high curvature areas in scanned lines, thereby
retaining the high level of initial information about the shape and structure of the scanned
object. However, a scattered data point cloud lacks a natural topological connection and
often experiences problems, such as uneven sampling, noise, and missing data. The simpli-
fied point cloud can easily have a bad impact on the appearance, modeling, and accurate
expression of the geometric model. Consequently, feature extraction and retention are
essential when a scattered data point cloud is simplified. For the feature extraction of
a point cloud model, scholars have done a lot of research [14–16]. On this basis, many
scholars have proposed point cloud simplification algorithms with feature preservation.
According to the neighborhood distribution characteristics of the model, some scholars
have proposed K-neighborhood-related simplification methods. Li et al. [17] first used a
k-d tree to segment point cloud data to establish a spatial topology and then combined
the change of the local normal vector and the reserved points as the threshold to simplify
the point cloud in the region. This algorithm prevents holes when the simplification ratio
is extremely high. Ji et al. [18] proposed a new K-neighborhood search method based on
distance and density, and feature points are selected according to their geometric features.
This method is named the detail feature points simplified algorithm (DFPSA) for a 3D
point cloud. According to the morphological characteristics of the model, some scholars
put forward the simplification method of geometric algebra. Mahdaoui et al. [19] proposed
a point cloud iterative simplification method based on the density function and Shannon
entropy estimation. The algorithm is suitable for different point clouds with different
densities, and the reconstruction effect of the simplified point cloud is near the original
point cloud model. Yuan et al. [20] proposed a simplified algorithm based on conformal
geometric algebra. The spherical tree is used to construct multi-resolution subdivision,
and K-means clustering is used to calculate the minimum boundary sphere, and then
the adaptive point cloud simplification is carried out. Bernard et al. [21] used the asso-
ciated point distribution model to represent the statistical shape model, but due to the
heteroscedasticity of point cloud data, the surface generated using only the reconstruction
method of the probability model will have a certain deviation, resulting in a low-accuracy
curvature calculation value. Some scholars pay attention to the noise variables in the point
cloud model, such as Zhu et al. [22], who proposed a noisy three-dimensional model data
simplification approach. The core idea of this approach is to use the guided filter algorithm,
often used in 2D images, to reposition point cloud data, especially noisy point cloud data,
to enhance the features and geometric details.
Starting from what is already known in this field, a new simplification algorithm for scattered point clouds with feature preservation is proposed to simplify the point cloud while saving as much information as possible. Guided by the morphological characteristics of the point cloud model itself, and based on the Delaunay neighborhood of the point cloud, this paper uses the moving least-square (MLS) method and discrete gradient ideas to extract the important points of the point cloud model, that is, the feature points, thereby simplifying the point cloud. The experimental results show that this method can be applied to point cloud models with different shapes, noise levels, and orders of magnitude.
The rest of this paper is organized as follows. Section 2 introduces the implementation
of our method and related formulas and describes the acquisition of the Delaunay neigh-
borhood of the point cloud and how to use the MLS method and the discrete gradient to
obtain the feature points of each part of the point cloud. Section 3 elaborates on the experi-
mental results, including the simplified results of our method under different models and
the conclusions after comparison with the other three methods. In the last section, some
conclusions and future work directions are given by observing the experimental results.
2. Methodology
The core idea of point cloud simplification is to remove a large amount of redundant data and retain feature data to ensure that the original model can be restored with as few points as possible. Point cloud features include global and potential features. Global features include edge and non-edge features. Potential features are the space divided after global features are extracted.
The flowchart of the proposed point cloud simplification method comprises four stages, as shown in Figure 1. We assume that P is the original point cloud model, which can be defined as P = {pi(x, y, z)}, i ∈ [1, n], where pi is the i-th point and n is the size of P. The simplified point cloud set is denoted as R. Specifically, after preprocessing the point cloud, we construct the Delaunay neighborhood of the point cloud P. According to the edge distribution characteristics of the points, the edge points are judged, which is Step 1 in the figure. Then, the MLS method is used to calculate the normal vector of the point cloud, and the valley-ridge feature points are extracted according to the maximum curvature threshold, which is Step 2. Then, according to the discrete gradient, the potential feature points are further judged and retained, which is Step 3. Finally, in Step 4, the non-feature points are extracted by the uniform sampling method, and the sampling interval is s.

Figure 1. Algorithm flowchart.

2.1. Extraction of the Point Cloud Edge
Generally, edge point clouds have more features than valley-ridge point clouds. Thus, these edge points must be retained in the process of point cloud simplification.
2.1.1. Delaunay Neighborhood
The most significant feature of the point cloud model is the lack of topological connections between point data. Therefore, taking the point cloud model as the research object, the neighborhood points of the point data should be calculated first.
For a point cloud model with a uniform sampling density, the k-nearest neighbors of a point can be selected to represent its neighborhood. However, the sampling density of many point cloud models varies, and the k-nearest neighbors of feature points may be located on one side of the area. Thus, the Delaunay neighborhood [23] is adopted in this study; that is, a local plane is fitted to point pi within its sphere neighborhood, the neighboring points are projected onto the plane to obtain the projection points qi, and Delaunay triangulation is performed on the projection points qi to obtain the Delaunay neighborhood (Np^Delaunay) of point p. As shown in Figure 2, the first-order Delaunay neighborhood of p is recorded as Np^1Delaunay.

Figure 2. First-order Delaunay neighborhood of point p.
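The projection-and-triangulation step lends itself to a compact implementation. The following sketch is our own illustration rather than the authors' code: it approximates the sphere neighborhood with a fixed k-nearest-neighbor query (the value of k and the use of NumPy/SciPy are our assumptions, not prescribed by the paper).

```python
import numpy as np
from scipy.spatial import cKDTree, Delaunay

def delaunay_neighborhood(points, idx, k=20):
    """First-order Delaunay neighborhood of points[idx] (illustrative sketch)."""
    tree = cKDTree(points)
    _, nbr_idx = tree.query(points[idx], k=k + 1)      # k neighbors plus the point itself
    nbrs = points[nbr_idx]

    # Fit the local plane by PCA: the eigenvector of the smallest eigenvalue
    # of the neighborhood covariance matrix is the plane normal.
    centroid = nbrs.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov((nbrs - centroid).T))
    u, v = eigvec[:, 2], eigvec[:, 1]                   # in-plane basis vectors

    # Project the neighbors onto the plane to obtain the points q_i.
    proj2d = np.column_stack(((nbrs - centroid) @ u, (nbrs - centroid) @ v))

    # Triangulate the projected points; the neighborhood of p consists of all
    # points that share a triangle with p (local index 0 = the query point).
    tri = Delaunay(proj2d)
    mask = np.any(tri.simplices == 0, axis=1)
    local = np.unique(tri.simplices[mask])
    return nbr_idx[local[local != 0]]                   # global indices of Np^Delaunay
```

Higher-order neighborhoods, as mentioned in the conclusions, could be obtained by repeating the query on the returned indices.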
2.1.2. The Edge Distribution Characteristics
The edge points of the point cloud can be obtained by constructing the Delaunay neighborhood of the point cloud and combining it with the edge distribution characteristics of the point cloud.
As shown in Figure 3, generally, for non-edge points, the points in the neighborhood are distributed around the point, whereas for edge points, the points in the neighborhood gather in a certain direction. According to the above rules, we use the edge coefficient Fp to measure the aggregation degree of the point distribution in the neighborhood of point p. That is, the unit vectors from a point to its neighbors are superimposed. After the superposition, a non-edge point tends to a zero vector in theory, while an edge point gives a non-zero vector, so the boundary points can be quickly and roughly extracted according to the superposed vector. The formula of Fp is shown below:

F_p = \frac{1}{D} \sum_{i=1}^{D} \frac{\overrightarrow{pp_i}}{|\overrightarrow{pp_i}|},  (1)

where pi is the i-th neighborhood point of point p and D is the number of points in the neighborhood of p. The larger the edge coefficient Fp is, the more non-uniform the point distribution around point p is, and the more detached the point is from the point cloud body, showing the characteristics of edge points.

Figure 3. The edge distribution characteristics of the point cloud. (a) Neighborhood distribution of non-edge points; (b) Neighborhood distribution of edge points.

To improve the adaptability of the algorithm, after calculating the boundary point index Fp of each point, a point is judged to be a boundary point if its Fp satisfies Equation (2):

F_p - m_F > 2\sigma_F,  (2)

where m_F is the mean value and \sigma_F is the standard deviation of Fp.
Edge point clouds carry many of the characteristics of the model, so these points must be retained in the process of point cloud simplification.
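A direct transcription of Equations (1) and (2) might look like the sketch below. It is a hedged illustration of our own: the neighborhoods are assumed to come from a routine such as the delaunay_neighborhood sketch above (or any neighborhood query), and the scalar edge coefficient is taken as the magnitude of the averaged unit vectors, which is how we read the zero-vector/non-zero-vector argument.

```python
import numpy as np

def edge_coefficients(points, neighborhoods):
    """Edge coefficient F_p of Equation (1) for every point (illustrative sketch).

    neighborhoods[i] is an index array of the Delaunay neighbors of points[i].
    """
    F = np.zeros(len(points))
    for i, nbrs in enumerate(neighborhoods):
        vecs = points[nbrs] - points[i]                  # vectors p -> p_i
        unit = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
        # Superpose the unit vectors; a non-edge point averages out near zero.
        F[i] = np.linalg.norm(unit.mean(axis=0))
    return F

def edge_points(points, neighborhoods):
    """Indices of boundary points according to Equation (2): F_p - m_F > 2*sigma_F."""
    F = edge_coefficients(points, neighborhoods)
    return np.where(F - F.mean() > 2.0 * F.std())[0]
```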

2.2. Determination of Valley-Ridge Feature Points
Valley-ridge and potential feature points are evaluated based on normal vectors. This section discusses the extraction of valley-ridge feature points. The MLS method [24,25] is used to fit the points in the first-order Delaunay neighborhood of point pi to a plane, and the normal direction of the plane is the normal vector of pi. Given that the direction of the normal vectors obtained in this manner is uncertain, these normal vectors must be unified. Unification is performed according to the direction of the line of sight to ensure the consistency of the direction of the normal vectors. Generally, the origin is regarded as the viewpoint position, and the line of sight is v = (−x, −y, −z)^T. The normal vector of the point cloud is adjusted such that v · n > 0. Finally, according to the adjusted normal vector, the Gaussian curvature (kg) and average curvature (kh) of each point are calculated. The principal curvature k = k_h ± \sqrt{k_h^2 - k_g} of the surface can then be obtained, and k takes the larger absolute value as the curvature value. The threshold CT is set. When the absolute value |k| of the principal curvature of a point is larger than CT·mean(sum(|k|)), that is, |k| > CT·mean(sum(|k|)), this point is a valley-ridge point of the model, and the set of such points is recorded as V.

2.3. Determination of Potential Feature Points
The previous section discusses the extraction of global features. The threshold setting is large; thus, only the obvious feature points in the model can be extracted. To avoid ignoring the potential feature points, we regard V as the first kind of potential feature points and then extract the second kind of potential feature points from the point cloud P based on the discrete gradient.
First, we define the discrete function fp, which is a feature detection operator at a point. Then, the discrete gradient at that point is calculated by using this operator. As shown in Figure 4, θpi is the angle formed by point p and its neighborhood point pi in the principal direction and can measure the degree of bending of a potential surface at a point. The principal direction in this work is the direction of the eigenvector v0 corresponding to the minimum eigenvalue of the covariance matrix of a point and its local neighborhood points. The formula of the covariance matrix T is as follows:

T = \sum_{p_i \in N_p} (p_i - c)(p_i - c)^T,  (3)

where c = \frac{1}{k} \sum_{i=1}^{k} p_i is the centroid of Np, Np is the Delaunay neighborhood of point p, and the eigenvalues of the covariance matrix T are λ0 < λ1 < λ2.

Figure 4. Illustration of the angle θpi.

Angle θpi can be expressed as

\cos\theta_{p_i} = \frac{d_2}{\| p - p_i \|_2},  (4)

where ||p − pi|| is the Euclidean distance between p and pi, and pi is a point in the Delaunay neighborhood of p. cos θpi decreases monotonically in the interval 0° < θpi < 180°; that is, the larger the value of θpi is, the smaller the value of cos θpi is, and the smoother the feature is at this point. On the contrary, the larger the value of cos θpi is, the sharper the feature at this point is. Therefore, the local feature detection operator fp of point p is defined as

f_p = \frac{1}{n} \sum_{i=1}^{n} \cos\theta_{p_i},  (5)

where the larger the value of fp is, the greater the bending degree of the potential surface in the local neighborhood of point p is.
The discrete gradient Gi between point p and point pi in its neighborhood is set as

G_i = \frac{f_p - f_{p_i}}{| p - p_i |},  (6)

where |p − pi| is the shortest path length between points p and pi, which is defined as the geodesic distance between the two points.
According to discrete Morse theory, the discrete gradient Gi is used as the criterion to extract the second type of potential feature points. The specific criterion is

g = \{ G_i > M_{G_i} \},  (7)

M_{G_i} = \frac{1}{n} \sum_{i=1}^{n} G_i.  (8)

When the above condition is satisfied, the corresponding points in the neighborhood are selected. Then, according to the number of times Ui that a point appears in the neighborhood of the first kind of potential feature points, the second kind of potential feature points are screened and distinguished, that is,

l = \{ g_i > M_{g_i} \},  (9)

where M_{g_i} = T_g × U_i, and T_g ∈ [0, 1] is the threshold value.
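To make Equations (3)–(9) concrete, a hedged sketch of the potential-feature selection is given below (again our own illustration, not the authors' code). It assumes per-point Delaunay neighborhoods are available, approximates the geodesic distance in Equation (6) by the Euclidean distance between neighbors, and takes the valley-ridge set V from Section 2.2 as the first kind of potential feature points; these simplifications and all function names are ours.

```python
import numpy as np

def principal_direction(points, idx, nbrs):
    """Eigenvector v0 of the smallest eigenvalue of the covariance matrix T (Eq. 3)."""
    local = points[np.append(nbrs, idx)]
    q = local - local.mean(axis=0)                       # p_i - c
    _, eigvec = np.linalg.eigh(q.T @ q)                  # eigenvalues in ascending order
    return eigvec[:, 0]

def feature_operator(points, neighborhoods):
    """Local feature detection operator f_p of Equation (5)."""
    f = np.zeros(len(points))
    for i, nbrs in enumerate(neighborhoods):
        v0 = principal_direction(points, i, nbrs)
        d = points[nbrs] - points[i]
        dist = np.linalg.norm(d, axis=1)
        # cos(theta_pi): projection of the neighbor direction onto the principal direction
        cos_theta = np.abs(d @ v0) / dist
        f[i] = cos_theta.mean()
    return f

def potential_feature_points(points, neighborhoods, V, Tg=0.3):
    """Second kind of potential feature points via the discrete gradient (Eqs. 6-9)."""
    f = feature_operator(points, neighborhoods)
    selected = np.zeros(len(points))                     # g_i: times a point passed Eq. (7)
    appeared = np.zeros(len(points))                     # U_i: times it appeared in a neighborhood of V
    for i in V:                                          # first kind of potential feature points
        nbrs = np.asarray(neighborhoods[i])
        appeared[nbrs] += 1
        dist = np.linalg.norm(points[nbrs] - points[i], axis=1)
        G = (f[i] - f[nbrs]) / dist                      # Eq. (6), Euclidean stand-in for geodesic
        selected[nbrs[G > G.mean()]] += 1                # Eqs. (7) and (8)
    mask = appeared > 0
    return np.where(mask & (selected > Tg * appeared))[0]   # Eq. (9): g_i > M_gi = Tg * U_i
```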

3. Experimental Results and Discussion


3.1. Preparation
MATLAB R2016a was used to program the algorithm. The computer was configured
with a Win10 system, a 3.4 GHz Intel Core i7 CPU, 16 GB of memory, and an NVIDIA GeForce GTX 960M
graphics card. To verify the effectiveness of the proposed point cloud simplification
algorithm, we selected four groups of point cloud data with different characteristics for
simplification analysis. The feature sharpness characteristics of these four data vary, and
their potential surfaces are uneven. The experiment evaluated the method from two
aspects, namely simplification and reconstruction effects, and compared it with the grid-
based, curvature-based simplification methods in Geomagic and feature-preserving [20]
simplification methods.
As shown in Figure 5a,b, the point cloud data of Bunny and Dragon models are all
from the 3D point cloud database established by Stanford University. The Dragon model
has 437,645 points, while the Bunny model has 35,947 points. The point cloud data of the
Mask model are obtained from the 3D measurement system built in our laboratory, and
the measured object is shown in Figure 6a. This system is shown in Figure 6b, consisting
of one projector and four cameras. Group A’s camera model is Manta G-504, with a
2452 × 2056 resolution and a pixel size of 3.45 µm; Group B’s camera model is MER-
500-14GM/CP, with a 2592 × 1944 resolution and a pixel size of 2.2 µm. The lens is
OPT-165M. The projector is DLP Light Crafter 4500. The Mask point cloud model obtained
by Group A’s cameras has 2,194,425 points, and that obtained by Group B’s cameras has
3,367,616 points.
Figure 5. (a) Bunny and (b) Dragon.

Figure 6. (a) Mask and (b) experimental setup.
3.2. Results and Analysis
The simplification of a point cloud requires deleting redundant points and retaining global and local features as much as possible on the premise of meeting the requirements of the simplification rate. The formula of the simplification rate is as follows:

\rho = \frac{P - R}{P},  (10)

where ρ is the simplification rate and P and R here denote the numbers of points in the original and simplified point clouds. The bigger the value is, the more points are deleted, that is, the greater the simplification degree is; otherwise, fewer points are deleted, that is, the simplification degree is lower.
When the simplification rate is too low, the appearance and details of the simplified model and the original model will not be very different. When the simplification rate is too high, the simplified model is prone to holes after reconstruction. Therefore, we determine the approximate range of ρ according to the number of points in the original model. The Bunny and Dragon models' ρ ranges from 20% to 80%, and that of the Mask A and Mask B models ranges from 35% to 90%.
The entire point cloud simplification process of the proposed method is divided into four stages: edge point detection, valley-ridge feature point estimation, potential feature point recognition, and non-feature point extraction. Taking the Bunny model as an example, when ρ = 53%, the point cloud data obtained by each step of our method are shown in Figure 7. For convenience of distinguishing, the point clouds obtained at different stages are displayed in different colors. Edge points are purple, valley-ridge points are green, potential feature points are blue, and non-feature points are red. It can be seen that valley-ridge feature point estimation and potential feature point recognition complement each other, obtaining good model details (such as the ears, torso, and face), while edge point detection and non-feature point extraction are good for filling smooth areas (such as the back). Thus, the final result is preserved well regardless of the feature type (i.e., contour or detail feature).
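Read end to end, the four stages amount to a small driver routine. The sketch below strings together the hypothetical helpers introduced with the Section 2 sketches (delaunay_neighborhood, edge_points, potential_feature_points); the valley-ridge selection and the uniform sampling of non-feature points are written inline under the same caveat that this is our illustration, not the released implementation.

```python
import numpy as np

def simplify(points, curvature_abs, CT=2.0, Tg=0.3, s=50):
    """Four-stage simplification sketch: edge, valley-ridge, potential, non-feature points.

    curvature_abs is assumed to hold |k| per point from the MLS/curvature step of Section 2.2.
    """
    n = len(points)
    neighborhoods = [delaunay_neighborhood(points, i) for i in range(n)]

    edges = edge_points(points, neighborhoods)                                   # Step 1
    valley_ridge = np.where(curvature_abs > CT * curvature_abs.mean())[0]        # Step 2: |k| > CT*mean(|k|)
    potential = potential_feature_points(points, neighborhoods, valley_ridge, Tg)  # Step 3

    feature = np.union1d(np.union1d(edges, valley_ridge), potential)
    non_feature = np.setdiff1d(np.arange(n), feature)[::s]                       # Step 4: uniform sampling, interval s
    return np.union1d(feature, non_feature)                                      # indices of the simplified cloud R
```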

Figure 7. (a) Edge points, (b) valley-ridge feature points, (c) potential feature points, (d) non-feature points, (e) simplified results, and (f) reconstruction results.

3.2.1. Setting Parameters
At the beginning of the study, we hoped to reduce the influence of artificial parameter adjustment in the algorithm on point cloud simplification as much as possible, so the number of artificial thresholds was set as low as possible. According to the introduction in Section 2, we can see from the formulas that the parameters to be adjusted are CT, Tg, and s. Different parameters have different effects on different types of point clouds. According to the definition of each threshold, if the value of CT is between 0 and 1, too many points with inconspicuous valley and ridge features will be stored, which will affect the extraction of potential feature points, so the value of CT should be greater than 1. An excessively large CT will also affect the subsequent simplification steps. Depending on the size and shape of the point cloud model, the upper limit of the CT value also differs. After repeated trials and comparisons, the maximum value of CT is about 5; otherwise, there are too few valley-ridge feature points. Tg is a weight coefficient, and generally, its value range is (0, 1). However, depending on the number of original point clouds, the value range also differs. In the case of a small number of original point clouds, the value of Tg is (0, 0.5). s is the sampling interval of the non-feature point cloud, and its value depends on the number of feature points retained after the first three steps of simplification and the target simplification rate of the point cloud. Therefore, in the following discussion, we only focus on the selection of CT and Tg.
Figure 8 shows the effect of the selection of CT and Tg values on the final simplified result of the Bunny model when the non-feature point sampling interval s is fixed at 50. In each row of images, from left to right, are the edge points, valley-ridge points, potential feature points, non-feature points, the final simplified point cloud model, and its reconstruction model. It can be seen that the larger the values of CT and Tg are, the fewer points are saved. Each step of our method is interlocking. The selection of valley-ridge points and the preservation of potential feature points are complementary to each other, and they can extract the details of the model very well. Appropriately adjusting the number of non-feature points can help reshape the entire simplified model. In summary, for a point cloud model with uniform sampling and distinctive features, CT and Tg are set to small values, and s can be selected according to the requirements of the simplification rate. If there is no special requirement, s can be set to a medium value.
3.2.2. The Self-Adapting Experiment Parameters
Due to the various forms of point cloud data acquisition, there are also many differences in point cloud models. The method in this paper simplifies the four different types of models in Figures 5 and 6 to ensure the applicability and feasibility of the method.
The point cloud models of the simplified results at different simplification rates and the corresponding reconstruction effects are shown in Figure 9, where Figure 9a–d are the Bunny, Dragon, Mask A, and Mask B models, respectively. When ρ = 0, the corresponding model in Figure 9 is the original model. With an increase in the simplification rate, the simplified model loses some texture information and local detail features when too few points remain after simplification. For example, when ρ = 71.4%, the ear part of the Bunny model is slightly missing. However, it can be seen from Table 1 that the maximum error, average error, and root-mean-square (RMS) error are kept at a small level, indicating that the simplified model basically retains the geometric features and contour appearance of the original model. Therefore, the proposed point cloud simplification algorithm effectively retains the geometric information in different models, such as the boundary, high curvature points, and local features.

Figure 8. The effect of different threshold selections on the Bunny model, where s = 50. (a) CT = 1.1, Tg = 0.1; (b) CT = 1.1, Tg = 0.5; (c) CT = 5, Tg = 0.1; and (d) CT = 5, Tg = 0.4.

Figure 9. Simplification results of our method: (a) Bunny, (b) Dragon, (c) Mask A, and (d) Mask B models.

3.2.3. Comparison of the Simplification Effect with Other Methods
To further verify the effectiveness of our method, in this section, we used the grid-based and curvature-based simplification methods in Geomagic and the feature-preserving [20] simplification method to simplify the data of the Bunny and Mask A models and compared the results with our method.
The simplified results are shown in Figure 10. The surface of the Bunny model has many wrinkles. The geometric details show that the feature points retained by our method are more continuous and uniform, whereas the other three methods lose a lot of feature information when the simplification rates are high. Especially for the Bunny ears, the difference is most obvious. Because the number of points in the ear part is not very large, it is easy to produce holes and deformations if feature preservation is not considered during the simplification process. For the Mask A model with an uneven feature distribution, our simplified model retains more points at the concave and convex changes, while the curvature-based simplification method focuses on feature retention and ignores the extraction of points in the smoother areas, resulting in more holes. The grid-based and feature-preserving simplification methods retain too many non-feature points in a few flat regions, which makes the face of Mask A too smooth, but the real face of Mask A is uneven.

Table 1. Simplification rate and evaluation of results.

Model   Original Points   ρ/(%)   Maximum Error/(mm)   Average Error/(mm)   RMS Error/(mm)
17.6 8.904 × 10−3 8 × 10−5 1.95 × 10−4
Bunny 35,947 53 2.1136 × 10−2 2.04 × 10−4 5.96 × 10−4
71.4 6.9928 × 10−2 1.89 × 10−3 3.075 × 10−3
25.1 4.614 × 10−2 3.3 × 10−5 4.69 × 10−4
Dragon 437,645 44.9 4.9589 × 10−2 6.7 × 10−5 4.95 × 10−4
78.0 4.6156 × 10−2 2.63 × 10−4 6.22 × 10−4
35.9 5.7673 × 10−2 2.69 × 10−4 1.119 × 10−3
Mask A 2,194,425 79.8 8.9860 × 10−2 6.58 × 10−4 2.571 × 10−3
89.9 9.8986 × 10−2 9.4 × 10−4 3.797 × 10−3
38.7 4.968 × 10−2 9.67 × 10−4 1.936 × 10−3
Mask B 3,367,616 72.1 8.2321 × 10−2 1.433 × 10−3 2.16 × 10−3
92.8 9.0322 × 10−2 2.647 × 10−3 4.927 × 10−3

The above is the intuitive visual impression of the simplified results obtained by the
four methods. The following is a detailed analysis of the error comparison of the four
methods based on numerical calculation.
Table 2 shows the error comparison between the simplified model and the original
Bunny and Mask A models under different simplification rates of different methods. It
can be seen that the error of our proposed method is the smallest, followed by the feature-
preserving method, and the results of the other two methods are relatively poor. Since
their simplification rates are similar but not exactly the same, it is not easy to observe in
the table, and Figure 11 was drawn to make the numerical comparison more intuitive.
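The maximum, average, and RMS errors reported in Tables 1 and 2 measure how far the simplified cloud deviates from the original model. A common way to compute such figures is a nearest-neighbor query from each original point to the simplified set; the short sketch below illustrates that idea and is our own assumption about how the evaluation can be implemented, not a description of the tooling the authors used.

```python
import numpy as np
from scipy.spatial import cKDTree

def simplification_errors(original, simplified):
    """Maximum, average, and RMS nearest-neighbor deviations (in the cloud's units)."""
    d, _ = cKDTree(simplified).query(original)   # distance of each original point to R
    return d.max(), d.mean(), np.sqrt(np.mean(d ** 2))
```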

Table 2. Error evaluation of the simplified result of Bunny and Mask A models.

Model   Method   ρ/(%)   Maximum Error/(mm)   Average Error/(mm)   RMS Error/(mm)
17.6 8.904 × 10−3 8×10−5 1.95 × 10−4
Our method 53 2.1136 × 10−2 2.04 × 10−4 5.96 × 10−4
71.4 6.9928 × 10−2 1.89 × 10−3 3.075 × 10−3
16.7 1.8284 × 10−2 1.61 × 10−4 3.17 × 10−4
Grid-based 40.3 5.9634 × 10−2 2.42 × 10−4 7.07 × 10−4
Bunny 69.8 9.9972 × 10−2 3.143 × 10−3 6.789 × 10−3
20.2 1.7422 × 10−2 1.78 × 10−4 3.16 × 10−4
Curvature-based 41.6 4.7625×10−2 2.95 × 10−4 8.65 × 10−4
69.8 9.9776 × 10−2 2.992 × 10−3 5.605 × 10−3
19.6 1.8519 × 10−2 1.31 × 10−4 2.22 × 10−4
Feature-preserving
40.1 2.4632 × 10−2 3.3 × 10−4 7.41 × 10−4
[20]
68.5 7.7979 × 10−2 2.1 × 10−3 3.89 × 10−3
35.9 5.7673 × 10−2 2.69 × 10−4 1.119 × 10−3
Our method 79.8 8.9860 × 10−2 6.58 × 10−4 2.571 × 10−3
89.9 9.8986 × 10−2 9.4 × 10−4 3.797 × 10−3
39.4 9.7915 × 10−2 5.11 × 10−4 2.777 × 10−3
Grid-based 69.1 9.8975 × 10−2 1.438 × 10−3 6.856 × 10−3
Mask A 92.4 9.8648 × 10−2 1.804 × 10−3 7.481 × 10−3
38 7.54 × 10−2 4.44 × 10−4 1.908 × 10−3
Curvature-based 69 9.8968 × 10−2 1.253 × 10−3 6.196 × 10−3
92 9.8648 × 10−2 1.78 × 10−3 7.534 × 10−3
40.6 6.7238 × 10−2 4.65 × 10−4 2.075 × 10−3
Feature-preserving
67.7 9.3995 × 10−2 7.82 × 10−4 3.591 × 10−3
[20]
89.6 9.8656 × 10−2 136.9 × 10−3 5.681 × 10−3
Figure 10. Simplification results of the grid method: (a) Bunny and (d) Mask A models. Simplification results of the curvature adaptive method: (b) Bunny and (e) Mask A models. Simplification results of the feature-preserving method [20]: (c) Bunny and (f) Mask A models.

Figure 11 shows the error relationship between the simplified model and the original model under the different methods. In the simplification of the Bunny model, the error results of the four methods are similar when the simplification rate is small, and our method is slightly better. When the simplification rate increases gradually, the errors of all four methods increase, but the error of our method is clearly much smaller. In the simplification of the Mask A model, the feature-preserving method and our method have greater advantages, and our method is better overall. In summary, our method can detect and extract the features of different point cloud models better, while retaining the edge points, making the 3D point cloud model more realistic and comprehensive.

Figure 11. Simplified error comparison of the Bunny model: (a) maximum error, (b) average error, and (c) RMS error. Simplified error comparison of the Mask A model: (d) maximum error, (e) average error, and (f) RMS error.

4. Conclusions
To retain as many feature points as possible when removing a large amount of redundant information in high-precision and high-density point clouds, this paper proposes a new simplification algorithm for scattered point clouds with feature preservation. The simplification method makes full use of the feature information of the original point cloud. First, the Delaunay neighborhood of the point cloud is established, and the contour of the model is extracted according to the edge distribution characteristics of the point cloud. Second, the valley-ridge points are preserved by the curvature using the MLS method. Then, according to the discrete gradient idea, the potential feature points are found. Finally, the non-feature points are sampled simply and comprehensively. The proposed method has good expansibility and adaptability. For future work, we would like to extend the first-order Delaunay neighborhood to a multi-order neighborhood to further improve the robustness and anti-noise ability of the algorithm. However, in the discrete gradient calculation, the shortest path length between two points is usually taken as the approximate geodesic distance, which is time consuming and thus requires further research and optimization to find a balance between simplification speed and accuracy.

Author Contributions: Conceptualization, Z.Z. and D.Z.; methodology, M.G., Z.Z. and D.Z.; software, M.G.; validation, M.G.; formal analysis, M.G.; investigation, M.G.; resources, Z.Z.; data curation, M.G.; writing—original draft preparation, M.G.; writing—review and editing, M.G., Z.Z. and D.Z.; visualization, M.G.; supervision, Z.Z. and D.Z.; project administration, M.G.; funding acquisition, Z.Z. and D.Z. All authors have read and agreed to the published version of the manuscript.

Funding: This work was supported by the National Natural Science Foundation of China (61572307).

Institutional Review Board Statement: Not applicable.

Informed Consent Statement: Not applicable.

Data Availability Statement: Not applicable.

Conflicts of Interest: The authors declare no conflict of interest.
References
1. Thanou, D.; Chou, P.A.; Frossard, P. Graph-Based Compression of Dynamic 3D Point Cloud Sequences. IEEE Trans. Image Process.
2016, 25, 1765–1778. [CrossRef] [PubMed]
2. Zhang, K.; Qiao, S.; Wang, X.; Yang, Y.; Zhang, Y. Feature-Preserved Point Cloud Simplification Based on Natural Quadric Shape
Models. Appl. Sci. 2019, 9, 2130. [CrossRef]
3. Sun, W.; Bradley, C.; Zhang, Y.; Loh, H. Cloud data modelling employing a unified, non-redundant triangular mesh. Comput. Des.
2001, 33, 183–193. [CrossRef]
4. Weir, D.J.; Milroy, M.; Bradley, C. Reverse engineering physical models employing wrap-around B-spline surfaces and quadrics. J.
Eng. Manuf. 1996, 210, 147–157. [CrossRef]
5. Martin, R.R.; Stroud, I.A.; Marshall, A.D. Data reduction for reverse engineering. Proc. Inf. Geometers Conf. 1997, 10, 85–100.
6. Kim, S.-J.; Kim, C.-H.; Levin, D. Surface simplification using a discrete curvature norm. Comput. Graph. 2002, 26, 657–663.
[CrossRef]
7. Zhao, P.; Wang, Y.; Hu, Q. A feature preserving algorithm for point cloud simplification based on hierarchical clustering. In
Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; pp.
5581–5584. [CrossRef]
8. Chen, Y.H.; Ng, C.; Wang, Y. Data reduction in integrated reverse engineering and rapid prototyping. Int. J. Comput. Integr. Manuf.
1999, 12, 97–103. [CrossRef]
9. Han, H.; Han, X.; Sun, F.; Huang, C. Point cloud simplification with preserved edge based on normal vector. Optik 2015, 126,
2157–2162. [CrossRef]
10. Tonini, M.; Abellán, A. Rockfall detection from terrestrial LiDAR point clouds: A clustering approach using R. J. Spat. Inf. Sci.
2014, 8, 95–110. [CrossRef]
11. Chen, Z.; Da, F. 3D point cloud simplification algorithm based on fuzzy entropy iteration. Acta Opt. Sinica 2013, 33, 0815001.
[CrossRef]
12. Turk, G. Texture synthesis on surfaces. Proc. Annu. Conf. Comput. Graph. Interact. Tech. SIGGRAPH 2001, 347–354. [CrossRef]
13. Markovic, V.; Jakovljevic, Z.; Miljkovic, Z. Feature Sensitive Three-Dimensional Point Cloud Simplification using Support Vector
Regression. Teh. Vjesn. Tech. Gaz. 2019, 26, 985–994. [CrossRef]
14. Chen, S.; Tian, D.; Feng, C.; Vetro, A.; Kovacevic, J. Contour-enhanced resampling of 3D point clouds via graphs. In Proceedings
of the 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), New Orleans, LA, USA, 5–9
March 2017; pp. 2941–2945.
15. Xia, S.; Wang, R. A Fast Edge Extraction Method for Mobile Lidar Point Clouds. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1288–1292.
[CrossRef]
16. Elkhrachy, I. Feature Extraction of Laser Scan Data Based on Geometric Properties. J. Indian Soc. Remote Sens. 2017, 45, 1–10.
[CrossRef]
17. Li, T.; Pan, Q.; Gao, L.; Li, P. A novel simplification method of point cloud with directed Hausdorff distance. In Proceedings of
the 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design (CSCWD), Wellington, New
Zealand, 26–28 April 2017; pp. 469–474.
18. Ji, C.; Li, Y.; Fan, J.; Lan, S. A Novel Simplification Method for 3D Geometric Point Cloud Based on the Importance of Point. IEEE
Access 2019, 7, 129029–129042. [CrossRef]
19. Mahdaoui, A.; Bouazi, A.; Hsaini, A.M.; Sbai, E.H. Entropic Method for 3D Point Cloud Simplification. In Computers and Devices
for Communication; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 613–621.
20. Yuan, S.; Zhu, S.; Luo, W.; Yu, Z.-Y.; Yuan, L.-W.; Li, D.-S. Feature preserving multiresolution subdivision and simplification of
point clouds: A conformal geometric algebra approach. Math. Methods Appl. Sci. 2018, 41, 4074–4087. [CrossRef]
21. Bernard, F.; Salamanca, L.; Thunberg, J.; Tack, A.; Jentsch, D.; Lamecker, H.; Zachow, S.; Hertel, F.; Goncalves, J.; Gemmar, P.
Shape-aware surface reconstruction from sparse 3D point-clouds. Med. Image Anal. 2017, 38, 77–89. [CrossRef] [PubMed]
22. Zhu, R.; Ma, S.; Xu, D. Guided Filter Simplification Method for Noisy Point Cloud Data. In Proceedings of the 2020 Chinese
Automation Congress (CAC), Xi’an, China, 7–8 November 2020; pp. 6951–6955.
23. Floater, M.S.; Reimers, M. Meshless parameterization and surface reconstruction. Comput. Aided Geom. Des. 2001, 18, 77–92.
[CrossRef]

24. Dey, T.K.; Sun, J. An Adaptive MLS Surface for Reconstruction with Guarantees. In Proceedings of the Third Eurographics
Symposium on Geometry Processing, Vienna, Austria, 4–6 July 2005; pp. 43–52.
25. Wang, H.; Scheidegger, C.; Silva, C. Bandwidth Selection and Reconstruction Quality in Point-Based Surfaces. IEEE Trans. Vis.
Comput. Graph. 2009, 15, 572–582. [CrossRef] [PubMed]
