This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/JSEN.2021.3079257, IEEE Sensors Journal

IEEE SENSORS JOURNAL, VOL. XX, NO. XX, MONTH X, XXXX

1530-437X © 2021 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
proposed a new method utilizing the Kalman filter and joint probabilistic data association filter to track the vehicles and estimate the vehicles' speed. Kim and Park [41] proposed an extended Kalman filter (EKF) reflecting the distance characteristics of LiDAR and radar sensors. This method can produce accurate distance estimations by sensor fusion. Wu et al. [42] proposed a systematic procedure for vehicle tracking using roadside LiDAR sensors. The procedure can be divided into five parts, and a field test was conducted to validate the proposed method. Vimal Kumar A. R. et al. [43] employed a low-density solid-state flash LiDAR for collecting sparse data. The JPDA algorithm and the Kalman filter were employed to extract the vehicles' trajectories. The results showed that the vehicles' trajectories can be extracted well using the proposed method.

Another critical problem here is how to represent the location of the vehicle for the association. Representative points and bounding boxes are the two widely used methods to represent the vehicle location. Since the shape of the vehicle varies with different distances and directions to the roadside LiDAR, the bounding box cannot effectively represent the location of the vehicle [44], [43]. Coifman et al. [45] first used corner points to track the vehicles. The results showed that using a corner point for vehicle tracking can greatly improve the tracking accuracy compared to using the bounding box. Later, Wu [46] and Sun et al. [47] also found that, compared to using the center point of the vehicle, using corner points can reduce the speed error. The vehicle trajectory can then be generated.

However, the occlusion issue is a big challenge for vehicle tracking. Based on the degree of occlusion, occlusion can be divided into partial occlusion and full occlusion [48], [49], [50]. Many studies have been conducted to address the occlusion challenge in image detection. Jung and Ho [2] developed a vehicle tracking algorithm considering occlusion reasoning for video detection. When occlusion is detected, the algorithm creates a new trajectory and links the new trajectory to each trajectory when there are no occlusions. The testing results showed a 5~10 km/h speed error. Zhang et al. [27] developed a unique framework to detect and handle object occlusion. The compactness ratio and interior distance ratio of the objects were used for occlusion detection on the intraframe level, and subtractive clustering on motion vectors was applied on the interframe level. The total detection rate of occlusion reached 93.87% and 100% for partial occlusion and full occlusion, respectively.

Though occlusion detection for image processing is relatively mature, the performance of cameras can be greatly influenced by harsh environmental conditions, such as weak light, rain, and snow [51], which limit the field of view and constrain their application. In contrast, LiDAR can be used to generate high-resolution vehicle trajectories: the LiDAR can work day and night without the influence of light conditions [52]. Researchers have also found solutions to improve the performance of LiDAR under adverse weather, such as rain and snow [50]. However, there are very limited studies addressing the occlusion issue for the roadside LiDAR [53]. Thornton et al. [54] used a rule-based method to identify a partially occluded vehicle in a parking lot: for example, any object with a length less than a predefined threshold, such as 2.5 meters, was considered an occluded vehicle. Lee and Coifman [22] detected the occlusion by checking whether the background curve can be seen between a given pair of vehicles; if not, the farther vehicle is suspected of being occluded. However, those pioneer studies can only work for specific sites and are not transferable.

Though some other previous studies also stated that the occlusion issue can be eliminated by setting up multiple LiDARs at different directions [55], [57], [58], the extra cost of adding and maintaining the LiDARs makes this approach impractical. Recently, Zhao et al. [25] proposed a Kalman filter-based method to fix the occlusion issue. At one timestamp i, the position of the object is estimated based on the historical information from its previous frame i-1. If the object is not detected in the current frame, then the algorithm will continuously search for the object in the next 1.5 seconds by using the speed in frame i-1. If the object is not detected within 1.5 seconds, the tracking algorithm stops. The limitation of this method is that it could generate high speed errors if the object is chopped into several parts due to the occlusion, since the Kalman filter-based algorithm will always link the nearest point in the next frame. Therefore, generating high-resolution vehicle trajectories that can overcome occlusion is still an open issue.

III. DATA AND METHODS

A. Tracking Point Switching Detection And Fixing

In order to extract the vehicles' trajectories from the roadside LiDAR data continuously and accurately, how to choose a representative tracking point is crucial. However, because the current algorithm uses the nearest point as the tracking point to associate the vehicle at different frames, different parts of the vehicle can inevitably be selected as the tracking point. Figure 2 shows an example where the tracking point switched from the front corner to point A in the middle of the vehicle and then to the back corner later. As a result, the speed calculation can be inaccurate.
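The nearest-point association strategy criticized above can be illustrated with a minimal greedy sketch. The function name and the dictionary-based data layout are assumptions for illustration, not the paper's code; the point is that whichever cluster point happens to be nearest gets linked, so the effective tracking point can jump around the vehicle body:

```python
import math

def nearest_point_association(prev_objects, curr_clusters):
    """Greedy nearest-neighbor (GNN-style) frame-to-frame association.

    Each tracked object from frame i-1 is linked to the closest cluster
    position in frame i.  When a vehicle is partially occluded or its
    visible extent changes, the 'nearest point' can switch from the
    front corner to the middle or the back corner, which is exactly the
    tracking point switching problem described above.
    """
    associations = {}
    for obj_id, (px, py) in prev_objects.items():
        best_cid, best_dist = None, float("inf")
        for cid, (cx, cy) in curr_clusters.items():
            d = math.hypot(cx - px, cy - py)
            if d < best_dist:
                best_cid, best_dist = cid, d
        associations[obj_id] = best_cid
    return associations
```

With this scheme a speed estimate inherits any jump of the association point, which motivates the corner-based tracking point of Section III.A.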
tolerance value ϒ in the angle is given to β. The back corner ... (though the corner point B may not be visible in the LiDAR data). Within NV (except point m), each point together with point m can create a line. The slope between different lines can ... the two points that created the max β in equation (4), then the ...

(Figure: adjusted tracking point, showing the max DVL in the trajectory, the case where the tracking point is not the corner point, and Quadrants 1 and 2 about the X-axis.)
B. Occlusion Issue Detection And Fixing

Based on the type of occluding object, the occlusion can be divided into two types: the background occluding issue and the road user occluding issue. The background occluding issue refers to the situation where the vehicle is blocked by the background, such as trees or buildings. The occlusion area is fixed since the locations of the LiDAR and the background are fixed. Figure 4 shows an example of the occlusion issue.

(Figure 4: example of the occlusion issue, showing the LiDAR, Object 1, and Vehicles A, B, D, E, and F.)

... represented by the angle between the Y-axis and the line created by the tracking point in frame i and i-1.

(1) Full Occlusion

It is assumed that there are E vehicles in frame i-1 and F vehicles in frame i. If there is one vehicle that is visible in frame i-1 and fully occluded in frame i, then E>F. Given the detection range of the LiDAR as R (R is the effective detection radius of the LiDAR), if

√(Xi-1² + Yi-1²) ≤ R    (6)

then there should be one vehicle G in frame i-1 that could not be associated with any vehicle in frame i. A prediction algorithm is developed to estimate the position of the vehicle. It should be mentioned that the prediction algorithm will not export the location of the vehicle in frame i if the vehicle is not detectable, but it will store the current speed, lane information, and moving direction in frame i-1.
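The full-occlusion check described above, comparing the vehicle counts of two frames and testing whether an unmatched vehicle was still inside the effective detection radius R of equation (6), can be sketched as follows. The function name, data layout, and the value of R are illustrative assumptions:

```python
import math

# Effective detection radius R of the LiDAR, in meters (assumed for
# illustration; Section IV reports an effective range of about 60 m).
R = 60.0

def detect_full_occlusion(prev_vehicles, curr_vehicles, matched_prev_ids):
    """Flag candidate fully occluded vehicles.

    A vehicle that was visible in frame i-1 (E vehicles), is unmatched
    in frame i (F vehicles, E > F), and still satisfied
    sqrt(x^2 + y^2) <= R in frame i-1 (equation (6)) is treated as
    fully occluded rather than as having left the scanned area.
    """
    occluded = []
    if len(prev_vehicles) > len(curr_vehicles):  # E > F
        for vid, (x, y) in prev_vehicles.items():
            if vid not in matched_prev_ids and math.hypot(x, y) <= R:
                occluded.append(vid)
    return occluded
```

For a flagged vehicle, the paper's prediction algorithm would then carry forward the stored speed, lane, and moving direction from frame i-1 instead of exporting a position.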
(Figure: occlusion area, the tracking point, VLi-1 and DVLi, and the final tracking point in frame i-1 and frame i.)
Fig. 5. Tracking Point Occlusion under Partial Occlusion

If DVLi >= VLi-1, this bias cannot be fixed, since the vehicle length information cannot be used to further adjust the location of the tracking point.

The second situation is that the visible part of the vehicle is small and the cluster is classified as a non-vehicle road user (bicycle/pedestrian). Here we assumed that a non-vehicle road user would not use the vehicle lane. Therefore, if a pedestrian suddenly appears in the vehicle lane away from the detection boundary (such as 5 meters within the boundary) of the LiDAR, then this must be an occluded vehicle. With this assumption, the type of road user is changed to a vehicle, and the vehicle can then be continuously tracked.

If the vehicle is chopped into two parts, only the part closest to the vehicle in the last frame can be tracked, and the other part will be assigned a new ID. The location and the frame ID (time information) are used to judge whether the two IDs belong to the same vehicle. The detailed algorithm is documented in Figure 6.

Fig. 6. Merging Two IDs Belonging to the Same Vehicle

The proposed method can be divided into two parts in this paper. The first part, Tracking Point Switching Detection and Fixing, elaborates on how to choose a representative vehicle tracking point. The second part, Occlusion Issue Detection and Fixing, clarifies the solutions for associating the unrelated vehicle records caused by the occlusion. Therefore, the first part is the basis of the second part.

IV. TESTING RESULTS AND DISCUSSION

A. Data Collection

In this paper, the RS-LiDAR-32 was employed for data collection. The LiDAR has 32 channels with scan angles of -25° to +15° in the vertical direction and 0° to 360° in the horizontal direction. More detailed information on the RS-LiDAR-32 is shown in Table I.

TABLE I
THE LIDAR SPECIFICATIONS

Indicator | Value
Laser beams | 32
Range | 40cm~200m
Range resolution | +/- 3cm
Scan FOV | 40°×360°
Vertical angle resolution | 0.33°
Rotation rate | 300/600/1200 (r/min)
Laser wavelength | 905nm
Size | 114mm (diameter) × 108.73mm (height)
Working temperature | -30℃~60℃
Weight | 1.17kg

In addition, the roadside LiDAR means the LiDAR deployed in a stationary location along the roadside (shown in Fig. 1), which is different from the mobile LiDAR (shown in Fig. 1) and the airborne LiDAR [41], [44]. As for the LiDAR type, there are rotating LiDAR, flash LiDAR, and solid-state LiDAR [45]. This paper focused on rotating LiDAR.

Fig. 1. Three types of LiDAR installation. (1) Roadside LiDAR fixed installation; (2) Roadside LiDAR portable installation; (3) Mobile LiDAR

B. Selected Sites For Evaluation

The proposed method was evaluated by processing the roadside LiDAR data collected at four sites. Figure 7 demonstrates the locations of the selected sites. The first location was selected at one intersection; the installation of the LiDAR is shown in Fig. 7. There are two traffic signs between the location of the LiDAR and the scanned road area; as a result, there are two occlusion areas on the road. The second location was selected at a road segment in front of a high school. The LiDAR was installed on a tripod on the road median for temporary data collection. Due to the high student volume, vehicles need to stop at this site to wait for pedestrians crossing the road. The third location was selected along the national highway G104, where there is mixed traffic (including many commercial trucks). The fourth location was selected at the on-ramp area of an overpass, where the LiDAR was also installed on a tripod. The occlusion issue in which one vehicle is blocked by another is common at this site when traffic converges, as shown in Fig. 7.
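The ID-merging rule of Section III.B (use the location and the frame ID, i.e. time information, to judge whether two IDs belong to the same vehicle, Figure 6) can be sketched as below. The track representation and both thresholds are illustrative assumptions, not values from the paper:

```python
def should_merge(track_a, track_b, max_gap_frames=15, max_gap_dist=3.0):
    """Judge whether two track fragments belong to the same vehicle.

    Each track is a list of (frame_id, (x, y)) samples.  The sketch
    merges fragment B into fragment A when B starts shortly after A
    ends (frame ID / time information) and A's last position is close
    to B's first position (location).  Thresholds are illustrative.
    """
    end_frame_a, (ax, ay) = track_a[-1]
    start_frame_b, (bx, by) = track_b[0]
    frame_gap = start_frame_b - end_frame_a
    dist = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
    return 0 < frame_gap <= max_gap_frames and dist <= max_gap_dist
```

A merged track simply concatenates the two fragments under the earlier ID, which is how a vehicle chopped by occlusion is stitched back into one trajectory.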
(Figure 9 plots speed (mph) against frame ID for Vehicle ID 1 and Vehicle ID 2.)
Fig. 9. Example of Tracking Point Location Adjustment Before-and-After Speed Distribution: (1) Speed Distribution Before Adjustment; (2) Speed Distribution After Adjustment
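The per-frame speeds plotted in Fig. 9 come from the displacement of the tracking point between consecutive frames. A minimal sketch, assuming a 10 Hz frame rate (600 r/min rotation) and an m/s-to-mph conversion (both assumptions for illustration):

```python
MPS_TO_MPH = 2.23694  # 1 m/s in mph

def frame_speeds(points, frame_rate_hz=10.0):
    """Per-frame speed estimates from consecutive tracking points.

    Speed = displacement between frames * frame rate.  A sudden jump
    of the tracking point (front corner -> middle -> back corner) adds
    a spurious displacement, producing the speed spikes seen before
    adjustment in Fig. 9; a stable corner point removes them.
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speeds.append(d * frame_rate_hz * MPS_TO_MPH)
    return speeds
```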
D. Compared To The State-of-the-art Method

To the authors' best knowledge, the method proposed by Zhao et al. [25] represents the most advanced method. This paper evaluated the proposed occlusion fixing method by comparing it with the method documented in [25]. The vehicle tracking algorithm developed by Cui et al. [3] (without considering the occlusion) was selected as a reference method. These three algorithms were applied to process the same 30-minute data collected from four different sites. Table 2 shows the results after processing.

TABLE 2
DATA PROCESSING RESULTS WITH DIFFERENT METHODS

Site | NDVT* with the method developed by Cui et al. [3] | NDVT* with the method developed by Zhao et al. [25] | Percentage of fixed occlusion with the method developed by Zhao et al. [25] | NDVT* with the proposed method | Percentage of fixed occlusion with the proposed method
1 | 50 | 35 | 30.0% | 3 | 94.0%
2 | 66 | 39 | 40.9% | 4 | 89.7%
3 | 168 | 97 | 42.26% | 5 | 97.0%
4 | 45 | 22 | 51.11% | 3 | 93.3%
*NDVT: number of detected vehicle trajectories.

It was shown that the proposed method can fix more disconnected trajectories than the method developed by Zhao et al. More than 89% of disconnected trajectories can be integrated with the proposed method, while at most about 52% of the disconnected trajectories can be integrated by the method of Zhao et al. The testing results of the method developed by Zhao et al. also varied significantly across sites: it fixed only 30.0% of disconnected trajectories at site 1 but 51.11% at site 4. In contrast, the proposed method had nearly the same high performance at all sites, which verifies the scalability of our algorithm in different scenarios.

Besides the performance in fixing the disconnected vehicles' trajectories, the detection range was also compared, among others. The detection range in [25] only reached 30m (in one direction) due to the thin point clouds far away from the LiDAR sensor. Thin point clouds lead to the loss of tracking points and greatly reduce the accuracy of vehicle tracking. The proposed method in this paper can switch the tracking point by creating a virtual point, which guarantees the continuity and accuracy of the tracking point. Besides, this method can ensure the continuity of the vehicle's trajectories according to the distance between adjacent frames. Comparison between the two algorithms showed that the effective detection range of ours can reach about 60m, which is two times farther than Zhao's.

E. Parameter Analysis and Error Diagnosis

The experimental results showed that if the value of ϒ became larger, the tracking point was easily located along the length direction or the width direction. Therefore, a larger value of ϒ had a negative impact on treating the corner point as the tracking point. If the value of ϒ became smaller, it could, in theory, precisely locate the corner point as the tracking point. However, due to the unsmooth surface of the vehicle and the discrete feature of point clouds, ϒ should be larger than 0°. So ϒ is suggested to be 30° in this paper.

There were still some errors found in the results after applying the proposed algorithm. The raw LiDAR data were then checked in RSView for diagnosis. If a long vehicle (such as a commercial truck) enters the road while blocked by another object, and the tracking happens to start at that moment, the vehicle may be chopped into two dispersed parts. Since frame i-1 is available for both parts, there will be two IDs reported. As a result, there may be two IDs existing in the record until the occlusion disappears, as shown in Figure 10.

Fig. 10. Long Chopped Truck

Another issue is a pickup with a trailer. Since the connection part between the vehicle and the trailer may not be detected by the LiDAR, the vehicle may be detected as two vehicle records, as shown in Figure 11.

(Figure 11 shows the raw LiDAR data, the clustering result in X-axis (m) and Y-axis (m) coordinates, and the extracted trajectory.)
Fig. 11. Pickup with a Trailer

The third issue is the long-time occlusion. One important assumption of the proposed method is that the occluded object can be detected again after some time. If the occlusion occurs continuously and the occluded object does not show up again, then the algorithm stops tracking at the beginning of the occlusion, as shown in Figure 12.
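The corner test controlled by the tolerance ϒ can be sketched as below. The exact geometric construction of β in the paper is only partially recoverable here, so this version (the angle at a candidate point between the lines to the two cluster extremes, accepted when within ϒ of 90°) is an assumption; it does use the suggested ϒ = 30°:

```python
import math

def is_corner_point(candidate, end_a, end_b, gamma_deg=30.0):
    """Angle-tolerance corner test (illustrative construction).

    For a roughly rectangular vehicle cluster, the angle beta at a true
    corner point, formed by the lines to the two cluster extremes, is
    close to 90 degrees.  gamma_deg > 0 absorbs the unsmooth vehicle
    surface and the discrete nature of the point cloud; 30 degrees is
    the value suggested in the paper.
    """
    ax, ay = end_a[0] - candidate[0], end_a[1] - candidate[1]
    bx, by = end_b[0] - candidate[0], end_b[1] - candidate[1]
    cos_beta = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    beta = math.degrees(math.acos(max(-1.0, min(1.0, cos_beta))))
    return abs(beta - 90.0) <= gamma_deg
```

A larger ϒ accepts points along the length or width edges (false corners); ϒ = 0° would reject real corners because of point-cloud noise, matching the trade-off described above.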
Fig. 12. Pickup with a Trailer Truck Occluded By Another One: (1) Frame 1899; (2) Frame 1926

Another special issue is shown in Figure 13. It can be seen that when the vehicle passed the line X=0, an additional part outside the vehicle was generated. This was caused by the mechanical features of the rotating LiDAR: the LiDAR considers the line X=0 with Y>=0 as the starting and ending location of one frame. When a vehicle is close to X=0, the vehicle may be scanned twice in the same frame. The distance between the first scanned part and the second scanned part can then be estimated based on the speed and the location of the vehicle, as shown in equation (9):

Disoff = v/F − d − DVL    (9)

where Disoff means the distance offset between the first scanned part and the second scanned part, v is the vehicle speed, F is the rotating frequency of the LiDAR, and d is the distance to the vehicle when it is first scanned. With the traditional GNN, another vehicle ID will be assigned to the first main body of the vehicle. Though the proposed method can merge the two object IDs into one, the speed at this frame may be exaggerated. The total detected length of the vehicle can be larger than its normal length, depending on the real moving speed of the vehicle. For this issue, the speed variance at this frame increases with the speed of the vehicle. This issue could not be solved by the current algorithm.

Fig. 13. Vehicle passing X=0 & Y>0 in one frame, showing the original tracking point and the adjusted tracking point.

V. CONCLUSION

This paper developed a novel method to extract the vehicle trajectories considering the influence of object occlusion issues (both partial occlusion and full occlusion) as well as the point switching issue. The proposed method was evaluated at four actual sites. Compared to the state-of-the-art methods, the proposed method can greatly improve the accuracy of the vehicle trajectories by automatically merging the disconnected ones. In brief, our contributions are summarized as follows. (1) A point switching adjustment algorithm via creating a virtual point was proposed to fix the tracking point switching issue, making it feasible to track the vehicles. (2) An occlusion detecting and fixing algorithm was proposed to extract the vehicle trajectories under full occlusion and partial occlusion. (3) The developed multiple vehicles' trajectories extraction algorithm outperformed state-of-the-art methods at four different experimental sites, fixing at least 89% of disconnected trajectories. This effort can benefit many transportation areas such as traffic volume count, vehicle speed tracking, adaptive traffic signal timing, and traffic safety analysis [59].

It should be mentioned that there are several special cases that the proposed method cannot handle; how to track vehicles under those special scenarios is a topic for future studies. Previous studies found that the use of multiple sensors can provide better performance in object tracking [37]. This research purely focused on vehicle detection and tracking using the roadside LiDAR sensor. Camera and radar sensors may also be available at some sites, and it will be interesting to integrate the data collected from different types of traffic sensors to further improve the tracking accuracy.
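The dominant term in the Disoff expression is the distance the vehicle travels during one LiDAR rotation, v/F: a vehicle scanned at both the start and the end of a frame has moved by that amount in between. A one-line helper (name assumed, illustrative only) makes the magnitude concrete:

```python
def travel_per_rotation(v_mps, rotation_hz):
    """Distance (m) a vehicle travels during one LiDAR rotation.

    v_mps: vehicle speed in m/s; rotation_hz: rotation frequency F.
    Near the frame boundary (X = 0, Y >= 0) this displacement separates
    the two scans of the same vehicle within one frame, the v/F term
    in the Disoff expression above.
    """
    return v_mps / rotation_hz
```

For example, at 20 m/s and a 10 Hz rotation rate the two scans of the same vehicle are offset by about 2 m, enough for a clustering step to report two separate parts.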
REFERENCES

[1] Wu, J., Xu, H., Zheng, Y. and Tian, Z., 2018. A novel method of vehicle-pedestrian near-crash identification with roadside LiDAR data. Accident Analysis & Prevention, 121, pp. 238-249.
[2] Jung, Y.K. and Ho, Y.S., 1999, October. Traffic parameter extraction using video-based vehicle tracking. In Proceedings 1999 IEEE/IEEJ/JSAI International Conference on Intelligent Transportation Systems (Cat. No. 99TH8383), pp. 764-769.
[3] Cui, Y., Xu, H., Wu, J., Sun, Y. and Zhao, J., 2019. Automatic Vehicle Tracking with Roadside LiDAR Data for the Connected-Vehicles System. IEEE Intelligent Systems, 34(3), pp. 44-51.
[4] Zhao, J., Xu, H., Wu, J., Zheng, Y. and Liu, H., 2018. Trajectory tracking and prediction of pedestrian's crossing intention using roadside LiDAR. IET Intelligent Transport Systems, 13(5), pp. 789-795.
[5] Bhaskar, P.K. and Yong, S.P., 2014, June. Image processing based vehicle detection and tracking method. In 2014 IEEE International Conference on Computer and Information Sciences (ICCOINS), pp. 1-5.
[6] Lv, B., Xu, H., Wu, J., Tian, Y., Zhang, Y., Zheng, Y., Yuan, C. and Tian, S., 2019. LiDAR-Enhanced Connected Infrastructures Sensing and Broadcasting High-Resolution Traffic Information Serving Smart Cities. IEEE Access, 7, pp. 79895-79907.
[7] Guan, H., Xingang, W., Wenqi, W., Han, Z. and Yuanyuan, W., 2016, May. Real-time lane-vehicle detection and tracking system. In IEEE 2016 Chinese Control and Decision Conference (CCDC), pp. 4438-4443.
[8] Xu, W., Wei, J., Dolan, J.M., Zhao, H. and Zha, H., 2012, May. A real-time motion planner with trajectory optimization for autonomous vehicles. In 2012 IEEE International Conference on Robotics and Automation, pp. 2061-2067.
[9] Ma, Y., Wu, X., Yu, G., Xu, Y. and Wang, Y., 2016. Pedestrian detection and tracking from low-resolution unmanned aerial vehicle thermal imagery. Sensors, 16(4):446.
[10] Wu, J. and Xu, H., 2017. Driver behavior analysis for right-turn drivers at signalized intersections using SHRP 2 naturalistic driving study data. Journal of Safety Research, 63, pp. 177-185.
[11] Wang, Q., Zheng, J., Xu, H., Xu, B. and Chen, R., 2017. Roadside magnetic sensor system for vehicle detection in urban environments. IEEE Transactions on Intelligent Transportation Systems, 19(5), pp. 1365-1374.
[12] Sun, Y., Xu, H., Wu, J., Hajj, E.Y. and Geng, X., 2017. Data processing framework for development of driving cycles with data from SHRP 2 Naturalistic Driving Study. Transportation Research Record, 2645(1), pp. 50-56.
[13] Zhou, X., Tanvir, S., Lei, H., Taylor, J., Liu, B., Rouphail, N.M. and Frey, H.C., 2015. Integrating a simplified emission estimation model and mesoscopic dynamic traffic simulator to efficiently evaluate emission impacts of traffic management strategies. Transportation Research Part D: Transport and Environment, 37, pp. 123-136.
[14] Zhao, J., Li, Y., Xu, H. and Liu, H., 2019. Probabilistic Prediction of Pedestrian Crossing Intention Using Roadside LiDAR Data. IEEE Access, 7, pp. 93781-93790.
[15] Mensing, F., Bideaux, E., Trigui, R. and Tattegrain, H., 2013. Trajectory optimization for eco-driving taking into account traffic constraints. Transportation Research Part D: Transport and Environment, 18, pp. 55-61.
[16] Chen, J., Tian, S., Xu, H., Yue, R., Sun, Y. and Cui, Y., 2019. Architecture of Vehicle Trajectories Extraction with Roadside LiDAR Serving Connected Vehicles. IEEE Access, 7, pp. 100406-100415.
[17] Feng, Y., Head, K.L., Khoshmagham, S. and Zamanipour, M., 2015. A real-time adaptive signal control in a connected vehicle environment. Transportation Research Part C: Emerging Technologies, 55, pp. 460-473.
[18] Yu, C., Feng, Y., Liu, H.X., Ma, W. and Yang, X., 2018. Integrated optimization of traffic signals and vehicle trajectories at isolated urban intersections. Transportation Research Part B: Methodological, 112, pp. 89-112.
[19] Chen, J., Xu, H., Wu, J., Yue, R., Yuan, C. and Wang, L., 2019. Deer Crossing Road Detection with Roadside LiDAR Sensor. IEEE Access, 7, pp. 65944-65954.
[20] Zheng, J., Yang, S., Wang, X., Xia, X., Xiao, Y. and Li, T., 2019. A Decision Tree based Road Recognition Approach using Roadside Fixed 3D LiDAR Sensors. IEEE Access, 7, pp. 53878-53890.
[21] Wu, J., Tian, Y., Xu, H., Yue, R., Wang, A. and Song, X., 2019. Automatic ground points filtering of roadside LiDAR data using a channel-based filtering algorithm. Optics & Laser Technology, 115, pp. 374-383.
[22] Lee, H. and Coifman, B., 2012. Side-fire lidar-based vehicle classification. Transportation Research Record, 2308(1), pp. 173-183.
[23] Gutjahr, B., Gröll, L. and Werling, M., 2016. Lateral vehicle trajectory optimization using constrained linear time-varying MPC. IEEE Transactions on Intelligent Transportation Systems, 18(6), pp. 1586-1595.
[24] Wan, N., Vahidi, A. and Luckow, A., 2016. Optimal speed advisory for connected vehicles in arterial roads and the impact on mixed traffic.
[26] Galceran, E., Olson, E. and Eustice, R.M., 2015, September. Augmented vehicle tracking under occlusions for decision-making in autonomous driving. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3559-3565.
[27] Zhang, W., Wu, Q.J., Yang, X. and Fang, X., 2008. Multilevel framework to detect and handle vehicle occlusion. IEEE Transactions on Intelligent Transportation Systems, 9(1), pp. 161-174.
[28] Kamijo, S., Matsushita, Y., Ikeuchi, K. and Sakauchi, M., 2000, September. Occlusion robust tracking utilizing spatio-temporal markov random field model. In Proceedings 15th International Conference on Pattern Recognition (ICPR-2000), 1, pp. 140-144.
[29] Lou, J., Tan, T., Hu, W., Yang, H. and Maybank, S.J., 2005. 3-D model-based vehicle tracking. IEEE Transactions on Image Processing, 14(10), pp. 1561-1569.
[30] Wu, J., Xu, H. and Zheng, J., 2017, October. Automatic background filtering and lane identification with roadside LiDAR data. In 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), pp. 1-6.
[31] Wu, J., Xu, H. and Zhao, J., 2018. Automatic lane identification using the roadside LiDAR sensors. IEEE Intelligent Transportation Systems Magazine, in press.
[32] Wu, J., Xu, H., Sun, Y., Zheng, J. and Yue, R., 2018. Automatic background filtering method for roadside LiDAR data. Transportation Research Record, 2672(45), pp. 106-114.
[33] Wu, J., 2018. An automatic procedure for vehicle tracking with a roadside LiDAR sensor. Institute of Transportation Engineers. ITE Journal, 88(11), pp. 32-37.
[34] Chavez-Garcia, R.O. and Aycard, O., 2015. Multiple sensor fusion and classification for moving object detection and tracking. IEEE Transactions on Intelligent Transportation Systems, 17(2), pp. 525-534.
[35] Song, T.L., Kim, H.W. and Musicki, D., 2015. Iterative joint integrated probabilistic data association for multitarget tracking. IEEE Transactions on Aerospace and Electronic Systems, 51(1), pp. 642-653.
[36] Chen, X., Li, Y., Li, Y., Yu, J. and Li, X., 2016. A novel probabilistic data association for target tracking in a cluttered environment. Sensors, 16(12):2180.
[37] Choi, J., Ulbrich, S., Lichte, B. and Maurer, M., 2013, October. Multi-target tracking using a 3d-lidar sensor for autonomous vehicles. In 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013), pp. 881-886.
[38] Li, R., Zhao, Y., Chen, J., Zhou, S., Xing, H. and Tao, Q., 2018, June. Target Detection Algorithm Based on Chamfer Distance Transform and Random Template. In 2018 IEEE 3rd International Conference on Image, Vision and Computing (ICIVC), pp. 106-112.
[39] Liu, S., Zheng, J., Wang, X., Zhang, Z. and Sun, R., 2019, June. Target Detection from 3D Point-Cloud using Gaussian Function and CNN. In 2019 34th Youth Academic Annual Conference of Chinese Association of Automation (YAC), pp. 562-567.
[40] Zhang, J., Xiao, W., Coifman, B. et al., 2020. Vehicle Tracking and Speed Estimation From Roadside Lidar. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 13, pp. 5597-5608.
[41] Kim, T. and Park, T.H., 2020. Extended Kalman filter (EKF) design for vehicle position tracking using reliability function of radar and lidar. Sensors, 20(15):4126.
[42] Wu, J., Zhang, Y., Tian, Y., Yue, R. and Zhang, H., 2021. Automatic Vehicle Tracking with LiDAR-Enhanced Roadside Infrastructure. Journal of Testing and Evaluation, 49, pp. 121-133.
[43] Vimal Kumar, A.R., Subramanian, S.C. and Rajamani, R., 2021. On Using a Low-Density Flash Lidar for Road Vehicle Tracking. Journal of Dynamic Systems, Measurement, and Control, 143(8).
[44] Zhang, Z.Y., Zheng, J., Wang, X. and Fan, X., 2018, July. Background Filtering and Vehicle Detection with Roadside Lidar Based on Point Association. In 2018 37th Chinese Control Conference (CCC), pp. 7938-7943.
[45] Coifman, B., Beymer, D., McLauchlan, P. and Malik, J., 1998. A real-time computer vision system for vehicle tracking and traffic surveillance. Transportation Research Part C: Emerging Technologies, 6(4), pp. 271-288.
[46] Wu, J., 2018. An automatic procedure for vehicle tracking with a roadside LiDAR sensor. Institute of Transportation Engineers. ITE Journal, 88(11),
Transportation Research Part C: Emerging Technologies, 69, pp.548-563. pp.32-37.
[25] Zhao, J., Xu, H., Liu, H., Wu, J., Zheng, Y. and Wu, D., 2019. Detection [47] Sun, Y., Xu, H., Wu, J., Zheng, J. and Dietrich, K.M., 2018. 3-D data
and tracking of pedestrians and vehicles using roadside LiDAR sensors. processing to extract vehicle trajectories from roadside LiDAR data.
Transportation research part C: emerging technologies, 100, pp.68-87. Transportation research record, 2672(45), pp.14-22.
[48] Wu, J., Xu, H. and Zheng, J., 2017, October. Automatic background
filtering and lane identification with roadside LiDAR data. In 2017 IEEE
20th International Conference on Intelligent Transportation Systems
(ITSC), pp. 1-6.
[49] Wu, J., Xu, H. and Zhao, J., 2018. Automatic lane identification using the
roadside LiDAR sensors. IEEE Intelligent Transportation Systems
Magazine, in press.
[50] Wu, J., Xu, H., Zhao, J., and Zheng, J., 2021. Automatic Vehicle
Detection with Roadside LiDAR Data under Rainy and Snowy
Conditions. IEEE Intelligent Transportation Systems Magazine, 13(1),
pp.197-209.
[51] Vaquero, V., del Pino, I., Moreno-Noguer, F., Sola, J., Sanfeliu, A. and
Andrade-Cetto, J., 2017, September. Deconvolutional networks for
point-cloud vehicle detection and tracking in driving scenarios. In 2017
IEEE European Conference on Mobile Robots (ECMR), pp. 1-7.
[52] Li, B., Zhang, T. and Xia, T., 2016. Vehicle detection from 3d lidar using
fully convolutional network. arXiv preprint arXiv:1608.07916.
[53] Wu, J., Xu, H., Zheng, Y., Zhang, Y., Lv, B. and Tian, Z., 2019. Automatic Vehicle Classification using Roadside LiDAR Data. Transportation Research Record, 2673(6), pp.153-164.
[54] Thornton, D.A., Redmill, K. and Coifman, B., 2014. Automated parking
surveys from a LIDAR equipped vehicle. Transportation research part C:
emerging technologies, 39, pp.23-35.
[55] Yue, R., Xu, H., Wu, J., Sun, R. and Yuan, C., 2019. Data Registration with Ground Points for Roadside LiDAR Sensors. Remote Sensing, 11(11), p.1354.
[56] Lv, B., Xu, H., Wu, J., Tian, Y., Tian, S. and Feng, S., 2019. Revolution
and rotation-based method for roadside LiDAR data integration. Optics &
Laser Technology, 119, p.105571.
[57] Lv, B., Xu, H., Wu, J., Tian, Y. and Yuan, C., 2019. Raster-based Background Filtering for Roadside LiDAR Data. IEEE Access, 7(1), pp.76779-76788.
[58] Wu, J., Xu, H. and Liu, W., 2019. Points Registration for Roadside
LiDAR Sensors. Transportation Research Record, 2673(9), pp.627-639.
[59] Zheng, Y., Zhang, Y. and Li, L., 2016. Reliable path planning for bus
networks considering travel time uncertainty. IEEE Intelligent
Transportation Systems Magazine, 8(1), pp.35-50.