
TELE ROBOTICS THROUGH 5G

Undergraduate graduation project report submitted in partial fulfillment of


the requirements for the
Degree of Bachelor of Science of Engineering
in

The Department of Electronic & Telecommunication Engineering


University of Moratuwa.

Supervisors:
Dr. Kasun Hemachandra
Dr. Peshala Jayasekara

Group Members:
Abeysundara A.C (170008X)
Jalath H.P (170244P)
Jayasinghe J.A.M.D (170263X)
Opatha O.W.H.P (170416V)

January, 2022
Approval of the Department of Electronic & Telecommunication
Engineering

......................................
Head, Department of Electronic &
Telecommunication Engineering

This is to certify that I/we have read this project and that in my/our opinion it is
fully adequate, in scope and quality, as an Undergraduate Graduation Project.

Supervisor: Dr. Kasun Hemachandra

Signature: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Date: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Declaration
This declaration is made on June 22, 2022.

Declaration by Project Group


We declare that the dissertation entitled Tele Robotics Through 5G and the work presented in it
are our own. We confirm that:
• this work was done wholly or mainly while in candidature for a B.Sc. Engineering
degree at this university,
• where any part of this dissertation has previously been submitted for a degree
or any other qualification at this university or any other institute, this has been
clearly stated,
• where we have consulted the published work of others, this is always clearly
attributed,
• where we have quoted from the work of others, the source is always given,
• with the exception of such quotations, this dissertation is entirely our own
work,
• we have acknowledged all main sources of help,
• parts of this dissertation have been published (see List of Publications).

................ ......................................
Date Abeysundara A.C (170008X)

......................................
Jalath H.P (170244P)

......................................
Jayasinghe J.A.M.D (170263X)

......................................
Opatha O.W.H.P (170416V)

Declaration by Supervisor
I/We have supervised and accepted this dissertation for the submission of the
degree.

................................... ...........................
Dr. Kasun Hemachandra Date

................................... ...........................
Dr. Peshala Jayasekara Date

Abstract

TELE ROBOTICS THROUGH 5G

Group Members: Abeysundara A.C, Jalath H.P, Jayasinghe J.A.M.D,


Opatha O.W.H.P

Supervisors: Dr. Kasun Hemachandra, Dr. Peshala Jayasekara

In this telecommunication era, the world is constantly moving through successive
generations of mobile communication technology. Moving from 3G to 4G and now to
5G, each generation brings major built-in improvements in various characteristics
while adding new functions. Teleoperation is an important field in robotics: from
surgical robots to space robots, teleoperated systems have found applications in many
areas. To achieve the expected performance in telerobotics, a large amount of data
must be continuously transferred through a high-speed network. Therefore, telerobotics
generally requires a network with high bandwidth, minimal packet loss, minimal delay,
and the capacity for real-time response. Since URLLC, or Ultra-Reliable Low Latency
Communication, is defined as one of the key use cases of 5G technology, 5G is very
suitable for applications such as telerobotics. This report discusses the development of
a simple telerobot that uses 5G technology to establish communication between the
user and the robot. We focus mainly on avoiding the typical problems of currently used
wireless or wired networks by using a 5G network. The end solution described in the
report establishes a telerobotic system with minimal latency while maintaining other
QoS parameters such as packet loss.
The remainder of this report is organized as follows. Chapter 1 gives an overall
introduction to the project and reviews related work on telerobotic applications under
different network conditions. Chapter 2 presents the architectural design of the 5G
robotic system we deployed, supported by SDN and Network Function Virtualization
(NFV), together with the web application design and the robot implementation.
Chapter 3 presents the results, including a comparative study of 4G and 5G based on
key performance indicators, and discusses the suggested architecture. Finally, we draw
our conclusions in Chapter 4.

Dedication

Firstly, we dedicate our Final Year Project, to the Department of Electronic and
Telecommunication Engineering, University of Moratuwa, the place where we gained
the knowledge and skills to make our project a success.
Then we dedicate our project to our dear lecturers who constantly supported,
guided, and encouraged us to apply the best of our abilities to the accomplishment
of our project.
Furthermore, we dedicate this project to our parents and teachers who have made
immense sacrifices to bring us to the position where we are today.
Finally, we would like to dedicate our project to the general community who wish
to apply their knowledge to solve real-world problems by coming up with innovative
solutions for the betterment of society.

Acknowledgements

First, we would like to express our sincere gratitude to our supervisors, Dr. Kasun
Hemachandra and Dr. Peshala Jayasekara, for their continuous guidance, support,
and commitment to the success of this project.
We are thankful to our Head of the Department, Dr. Ranga Rodrigo, the final year
project lecturer coordinator, Dr. Mevan Gunawardena, and all other lecturers in our
department who gave valuable comments and suggestions during presentations and
helped us to improve the results of this project.
We extend our gratitude to our external collaborators, Eng. Shamil Dilshan
Premathunga and Eng. Pasan Dharmasiri from the Dialog Innovation Lab, University
of Moratuwa, for the immense support given throughout the project.
We would like to express our sincere gratitude to Ms. Srianthie Salgado for her
valuable advice on writing professional reports for our project.
Finally, we are thankful to our dear parents for their numerous sacrifices and
contributions towards making our project a success within the given time frame, and
we thank all those who directly or indirectly helped us to make this project a success.

Contents

Declaration ii

Declaration by Supervisor iii

Abstract iv

Dedication v

Acknowledgements vi

Contents viii

Acronyms and Abbreviations xii

1 Introduction 1
1.1 Nature and the scope of the problem . . . . . . . . . . . . . . . . . . 1
1.1.1 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 Proposed Solution . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.2.1 Why use 5G? . . . . . . . . . . . . . . . . . . . . . . 2
1.1.3 Project Architecture . . . . . . . . . . . . . . . . . . . . . . . 3
1.1.4 Primary Objectives . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Literature survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.1 Telerobot networks types . . . . . . . . . . . . . . . . . . . . . 5
1.2.2 Video Encoding Methods . . . . . . . . . . . . . . . . . . . . . 5
1.2.3 Real-time video transmission . . . . . . . . . . . . . . . . . . . 5
1.2.4 Delay and Reliability requirements . . . . . . . . . . . . . . . 6
1.3 Method of investigation . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3.1 5G Network deployment . . . . . . . . . . . . . . . . . . . . . 7
1.3.1.1 Core Network deployment . . . . . . . . . . . . . . . 7
1.3.1.2 Radio Access network deployment . . . . . . . . . . . 7
1.3.2 Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3.2.1 Fully Virtual Robot Simulation . . . . . . . . . . . . 7
1.3.2.2 A Life-sized Physical Robot . . . . . . . . . . . . . . 7
1.3.3 Tele-operation Web Application . . . . . . . . . . . . . . . . . 8
1.3.3.1 WebSockets . . . . . . . . . . . . . . . . . . . . . . . 8
1.4 Principal results of the investigation . . . . . . . . . . . . . . . . . . . 8

2 Methodology 9
2.1 5G network deployment . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1.1 5G core network deployment . . . . . . . . . . . . . . . . . . . 9
2.1.1.1 5G Standalone vs Non-standalone architecture . . . . 9
2.1.1.2 5G core network deployment options . . . . . . . . . 11

2.1.1.3 5G core network functions . . . . . . . . . . . . . . . 12
2.1.1.4 OAI core network deployment . . . . . . . . . . . . . 15
2.1.2 5G Radio Access Network Deployment . . . . . . . . . . . . . 17
2.1.2.1 Virtual Implementation using RAN Simulators . . . 17
2.1.2.2 Implementation with USRP Devices and 5G modules 23
2.2 Teleoperation Application . . . . . . . . . . . . . . . . . . . . . . . . 25
2.2.1 Operator Interface . . . . . . . . . . . . . . . . . . . . . . . . 27
2.3 Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.3.1 Objective of the Robot Design . . . . . . . . . . . . . . . . . . 29
2.3.1.1 Basic Maneuvering Capability . . . . . . . . . . . . . 30
2.3.1.2 Real-Time Audio/Video Communication . . . . . . . 30
2.3.1.3 Collision Avoidance . . . . . . . . . . . . . . . . . . . 30
2.3.1.4 Robot Design process . . . . . . . . . . . . . . . . . 30
2.3.2 Components Used . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.2.1 Raspberry Pi 4B . . . . . . . . . . . . . . . . . . . . 31
2.3.2.2 Pololu 37D Metal Gear Motors . . . . . . . . . . . . 32
2.3.2.3 Dangaya 2.0 Motor Driver . . . . . . . . . . . . . . . 33
2.3.2.4 HC-SR-04 Ultrasonic Sensor . . . . . . . . . . . . . . 33
2.3.2.5 Generic Web Camera . . . . . . . . . . . . . . . . . . 35
2.3.2.6 Voltage regulators . . . . . . . . . . . . . . . . . . . 36
2.3.2.7 SimCom 5G UE Module . . . . . . . . . . . . . . . . 37
2.3.3 Building the Robot . . . . . . . . . . . . . . . . . . . . . . . . 38
2.3.3.1 SolidWorks Design . . . . . . . . . . . . . . . . . . . 38
2.3.3.2 PCB Design . . . . . . . . . . . . . . . . . . . . . . . 39
2.3.4 Suggestions for Further Improvements . . . . . . . . . . . . . 41
2.3.4.1 Build a Life-sized Robot . . . . . . . . . . . . . . . . 41
2.3.4.2 Use More Sophisticated Hardware . . . . . . . . . . . 41
2.3.4.3 Test the Robot on a 5G SA Network . . . . . . . . . 41
2.3.5 Alternative Methods . . . . . . . . . . . . . . . . . . . . . . . 41

3 Results 44
3.1 5G network Deployment . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.1.1 Ping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.1.1.1 Pinging from UE to Ext-dn . . . . . . . . . . . . . . 44
3.1.1.2 Pinging between UEs . . . . . . . . . . . . . . . . . . 46
3.1.2 Iperf test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.2 Teleoperation Application . . . . . . . . . . . . . . . . . . . . . . . . 49

4 Discussion and Conclusion 51

References 52

Appendix 54

List of Figures

1.1 Overall architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

2.1 Non-standalone Vs. Standalone architecture . . . . . . . . . . . . . . 11


2.2 5G service based architecture . . . . . . . . . . . . . . . . . . . . . . 13
2.3 OpenAirInterface 5G core network architecture . . . . . . . . . . . . 16
2.4 Connecting two virtual UEs to the core network through virtual gNodeBs 18
2.5 gNodeB configuration file . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.6 Network interfaces in gNodeB . . . . . . . . . . . . . . . . . . . . . . 20
2.7 Command to turn on gNodeB . . . . . . . . . . . . . . . . . . . . . . 20
2.8 UE Configuration file for new UE . . . . . . . . . . . . . . . . . . . . 21
2.9 Data Base Entry related to the new UE . . . . . . . . . . . . . . . . . 21
2.10 Command to turn on virtual UEs using RFsimulator mode . . . . . . 21
2.11 Virtual UE and gNodeB setup . . . . . . . . . . . . . . . . . . . . . . 22
2.12 Architecture with USRP device . . . . . . . . . . . . . . . . . . . . . 23
2.13 Setup with USRP module . . . . . . . . . . . . . . . . . . . . . . . . 24
2.14 WebRTC Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.15 Heroku Server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.16 TURN Server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.17 Operator Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.18 Original Conceptual Design of the Robot . . . . . . . . . . . . . . . . 29
2.19 Raspberry Pi 4B Single Board Computer . . . . . . . . . . . . . . . . 32
2.20 Pololu 37D Metal Gear Motor with Encoders . . . . . . . . . . . . . . 32
2.21 Aptinex Dangaya 2.0 Dual Motor Driver . . . . . . . . . . . . . . . . 33
2.22 HC-SR04 Ultrasonic Sensor . . . . . . . . . . . . . . . . . . . . . . . 34
2.23 Distance Measurement Mechanism of Ultrasonic Sensors . . . . . . . 34
2.24 Effective Distance Measurement Range for Ultrasonic Sensors . . . . 35
2.25 Web Camera . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
2.26 XL4015 Buck Converter . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.27 120W Buck Boost Converter . . . . . . . . . . . . . . . . . . . . . . . 36
2.28 SimCom 5G UE Module . . . . . . . . . . . . . . . . . . . . . . . . . 37
2.29 Communicating with SimCom 5G UE Module with AT Commands
through a Serial Monitor . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.30 CAD Drawing of the Robot . . . . . . . . . . . . . . . . . . . . . . . 39
2.31 Final Design of the Robot . . . . . . . . . . . . . . . . . . . . . . . . 40
2.32 Sample Control Command JSON Object . . . . . . . . . . . . . . . . 40
2.33 Simulation Environment . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.34 Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.35 Camera Feed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43

3.1 Ping from UE1 to Ext-dn . . . . . . . . . . . . . . . . . . . . . . . . 44


3.2 Ping from UE2 to Ext-dn . . . . . . . . . . . . . . . . . . . . . . . . 45

3.3 Ping values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
3.4 AMF logs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.5 Ping values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.6 Wireshark analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.7 Throughput values . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.8 Latency measuring method . . . . . . . . . . . . . . . . . . . . . . . . 49
3.9 Latency comparison between different networks . . . . . . . . . . . . 50

4.1 PCB Schematic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54


4.2 PCB layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55

List of Tables

1.1 QoS requirements for different applications . . . . . . . . . . . . . . . 6

2.1 5G Network implementation available options . . . . . . . . . . . . . 11


2.2 Component Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3 Network Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.4 Webots vs Gazebo . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

3.1 Iperf test results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

Acronyms and Abbreviations

UE - User Equipment
URLLC - Ultra Reliable Low Latency Communication
USRP - Universal Software Radio Peripheral
SDR - Software Defined Radio
IP - Internet Protocol
TCP - Transmission Control Protocol
UDP - User Datagram Protocol
AMF - Access and Mobility Function
UPF - User Plane Function
SMF - Session Management Function
NRF - Network Repository Function
Ext-dn - External Data Network
SDP - Session Description Protocol
ICE - Interactive Connectivity Establishment

1 Introduction
1.1 Nature and the scope of the problem

Telerobotics is the area of robotics concerned with the control of semi-autonomous
robots from a distance, mainly using wireless networks (such as Wi-Fi, Bluetooth, and
4G mobile networks) or tethered connections. It is a combination of two major
subfields, teleoperation and telepresence:

1. Teleoperation - controlling the robot from a distant location

2. Telepresence - receiving audio and video feedback from the robot so that the
operator feels present in the remote environment, projecting his or her presence
through the remote robot

Telerobotics is used in several fields nowadays. Telerobotic devices are typically
developed for situations or environments that are too dangerous, uncomfortable,
limiting, repetitive, or costly for humans to perform in. Telesurgery, bomb disposal,
and space exploration are some examples.

1.1.1 Problem Statement


A telerobotic system performs well only if the user feels comfortable with the
latency of the system, the lag in the response to movements, and the visual
representation. Issues such as inadequate resolution, latency of the video image, lag
in the mechanical and computer processing of movement and response, and optical
distortion due to camera lenses and head-mounted display lenses can cause the user
'simulator sickness', which is exacerbated by the lack of vestibular stimulation
accompanying the visual representation of motion. Currently, telerobotic systems are
developed using wireless technologies such as 4G, Wi-Fi, Bluetooth, etc. These
technologies have several drawbacks:

• Lower network speeds make time-critical tasks hard to perform

• Audio-video transmission (telepresence) problems such as high packet delay,
jitter, packet loss, etc.

Due to these drawbacks, the above-mentioned requirement is hard to fulfill. In this
project, we use 5G technology to avoid such drawbacks and implement both
teleoperation and telepresence.

1.1.2 Proposed Solution
To avoid the above-mentioned drawbacks, in this project we use 5G technology to
establish the connection between the robot and the control interface. 5G provides
higher data rates than existing mobile networks, which makes it a good solution for
high-quality video and audio transmission. Since 5G provides higher reliability and
lower latency, it is a particularly good solution for control signal transmission. 5G
also provides higher connection density compared to other networks, which makes it
suitable for use-case scenarios where multiple robot units are deployed. Network
slicing is a unique property of 5G technology that allows the establishment of
different slices for different use cases with different QoS requirements. Using 5G for
telerobotics can be recognized as a URLLC (Ultra-Reliable Low Latency
Communication) use case.

1.1.2.1 Why use 5G?


One of the key requirements of telerobotic systems is to achieve low latency
communication. The following key features provided by 5G technology make it a
more suitable candidate for telerobotic networks than its predecessors.

• Ultra-reliable low latency communication (URLLC)


Theoretically, 5G technology aims to reach speeds that are 20 times faster than
4G LTE. Changes made in modulation and authentication techniques, as well as
the manageability and flexibility of the 5G core network, have contributed towards
ultra-reliable low latency communication in 5G. Hence, using a 5G network in
a telerobotic system can provide seamless ultra-low latency, which enables
controlling the robot from a distance with no significant interruption.

• Network slicing
As per the architecture of our project, three types of data are transmitted
between the robot and the controller: video, audio, and control signals. 5G
technology introduces a network management feature called "network slicing",
which can separate different types of network traffic into slices with different QoS
parameters, so they can be treated in different ways. This is crucial for the
teleoperation scenario, as low-latency traffic and high-bandwidth traffic can be
handled by two different slices. For telerobotics, one slice can be used for
teleoperation and another for telepresence: an ultra-low-latency slice carries the
control commands, while the telepresence streams (video and audio signals) are
transmitted using a high-capacity slice with higher bandwidth.

1.1.3 Project Architecture

Figure 1.1: Overall architecture

In our solution, the robot and the operator will be connected through a 5G network
ensuring low latency and other QoS parameters. The robot will be teleoperated
from a distance but it will contain some sensors which will enable semi-autonomous
navigation. The sensor data will be processed onboard and it will help the teleoperator
to identify and minimize the collisions. The operator will be able to use a Graphical
User Interface which will be developed by us to control the robot. The interface will
output the live video feedback from the camera installed on the robot and will take
input from the user to control the robot. A speaker will be installed on the robot
to communicate with the customer that the robot will be serving. The component
selection for the robot will be discussed in the latter part of the report.

In the end solution, there will be two main links between the robot and the tele-
operator, as indicated in the above figure. One link will be established to send manipula-
tion commands and audio streams from the teleoperator to the robot. A feedback link
from the robot to the operator will also be established to carry video and audio feedback
from the robot to the teleoperator. Video encoding methods will be used to reduce
the bandwidth requirement of the feedback link.

1.1.4 Primary Objectives


The primary objective of this project is to develop a robot that can be controlled
from a distance by the user through an interface. Key factors to consider when
designing a telerobot are as follows.

• Reduce latency considering all possible factors (Network latency and processing
latency)

• Telepresence - the operator receives audio and video feedback from the robot
and feels present in the remote environment, projecting his or her presence
through the remote robot

• Provide reliable network

• The delay between control signals and video-audio feedback should be reduced

To demonstrate the 5G telerobot scenario in this project, we have designed a robot
that can be controlled in an environment with obstacles.

1.2 Literature survey

1.2.1 Telerobot networks types


Teleoperation is a field of robotics where the robot is controlled from a distance.
Most of the telerobot implementations follow the master-slave architecture where the
master controller is operated by a human while the slave robot acts according to the
commands. The distance between the slave robot and master controller can vary
according to the situation. There are teleoperated robots, as in L. Bertinetto et al. [1],
where the robot is controlled from a very short distance. A Local Area Network can
be established in such cases, as in [1]. Teleoperated robots are heavily used in surgical
systems, as in J.R. Ohm et al. [2]. Latency and accuracy are very critical in such
cases.

1.2.2 Video Encoding Methods


Video broadcasting is essential in most telerobotic applications. There are different
methods used for this purpose. In some cases, as in P. Salva-Garcia et al. [3], third-
party software has been used to broadcast video to the master controllers. These
software solutions are easier to adopt, and they are updated regularly with the latest
technologies by their respective developers. Some solutions, as in P. A. Chou et al.
[4], encode the video with encoders such as H.264/AVC. The implementation in D.
Wu et al. [5] shows real-time video adaptation in a 5G network using H.265, which
reduces bandwidth requirements by up to 50% when sending video.

1.2.3 Real-time video transmission


There are several real-time video transmission methods in existing research. A.K.
Katsaggelos et al. [6] concentrate on current developments in packet-based real-time
video transmission. The video encoder compresses the actual video signal first. By
exploiting both temporal and spatial redundancy, compression minimizes the number
of bits necessary to describe the video sequence. Because the encoded video will be
transmitted over a lossy communication channel, it must be encoded in a way that
minimizes the effect of errors on decoded video quality. Work done by Ren, M. Liu
et al. [7] explores fine-grained control methods for video transmission systems. To
meet the quality requirements of real-time transmission, a two-level comprehensive
control method is used. In level 1, video coding is divided into a basic layer and an
expanded layer. When network loads are high, only the basic layer is transmitted;
the basic layer may be sufficient for a basic video display. Level 2 continues the
dynamic regulation of data flows by using RTCP feedback and assessing network
load situations.

1.2.4 Delay and Reliability requirements


Delay plays an essential role in the performance and speed of remotely controlled
machine operations. In current 4G mobile networks, the actual delays in the data link
layer are in the order of 50-300 ms [8]. The delay and reliability requirements vary
depending on the application: real-time gaming requires a 50 ms delay with a packet
loss rate of 10^-3, interactive gaming a 100 ms delay with a packet loss rate of 10^-3,
and streaming and file downloading a 300 ms delay with a packet loss rate of 10^-6.
These are user plane or data plane delays, which can be as low as 10-20 ms because of
the short sub-frame of 1 ms [8]. Reliability is part of the larger concept of dependability
[9]. Low reliability may have effects in tactile interfaces similar to those of a large delay
[10]. Reliability is a measure expressing when the system is available [11]. Reliability
may have different definitions, but in [11] reliability is the number of sent packets, or
more generally protocol data units (PDUs), successfully delivered to the destination
within the time constraint required by the targeted service, divided by the total
number of sent packets. The following table shows the required QoS parameters for
different applications related to telerobotics [12].

Table 1.1: QoS requirements for different applications


QoS parameter    Audio         Video         Graphics           Haptics
Delay            ≤ 150 ms      ≤ 400 ms      100-300 ms         3-60 ms
Jitter           ≤ 30 ms       ≤ 30 ms       ≤ 30 ms            1-10 ms
Data loss rate   ≤ 1%          ≤ 1%          ≤ 10%              0.01-10%
Data rate        22-200 kbps   2.5-40 Mbps   45 kbps-1.2 Mbps   128 kbps

1.3 Method of investigation

This project can be subdivided into three major sections, as follows. For each
section we analyzed possible alternative solutions thoroughly and selected the
solution best suited to our requirements. These are discussed further in Chapter
2.

1.3.1 5G Network deployment


1.3.1.1 Core Network deployment
• When selecting a method for deploying the 5G core network, we chose one
that can deploy a 5G standalone network.

• When selecting the software platform for the 5G standalone network, OAI was
selected since it supports RAN simulators and has a large developer community.

1.3.1.2 Radio Access network deployment


• When deploying the RAN there were two choices: deploying the network using
a virtual RAN simulator, or using SDR devices to deploy the network.

• As the virtual RAN simulator, RFSimulator was chosen because it is
compatible with the OAI core.

• As the SDR device, the USRP B210 was chosen because it is compatible
with the OAI core.

1.3.2 Robot Design


1.3.2.1 Fully Virtual Robot Simulation
• The most cost-effective way.

• Performance heavily depends on the hardware platform on which the simulator
runs.

• Easier and faster than building a physical robot.

1.3.2.2 A Life-sized Physical Robot


• The best and most accurate way to test the use case.

• Performance can be improved by using better hardware.

• Can be improved further based on customer feedback.

1.3.3 Tele-operation Web Application
1.3.3.1 WebSockets
• Use WebSockets for communication instead of peer-to-peer communication.

1.4 Principal results of the investigation

After undertaking the above procedure, the following is a brief summary of the results
we obtained. These results are explained in detail in Chapter 3.

• For the 5G core network and RAN, we were able to connect one UE through
one gNodeB and take network measurements such as ping and throughput (using
the iperf test).

• We were able to connect the web application and the robot through a LAN or
other mobile communication networks. We tested it through a LAN, a 4G
network, and a 5G NSA network.

2 Methodology
In this chapter, the methodology used for 5G network deployment, video and
audio transmission, robot implementation, and web application development is
discussed.
2.1 5G network deployment

Mobile networks evolved from 1G to 5G, adding new features and capabilities in each
generation. The first generation of mobile networks was only capable of voice
communication. 2G mobile networks introduced data services, and the 2G core
network has two separate entities to handle data traffic and voice traffic. 3G mobile
networks enhanced the uplink and downlink capabilities of the previous 2G networks.
4G mobile networks were a game-changer, as they do not have two separate cores to
handle voice and data traffic; they have only a packet core, so voice is also transmitted
as data packets. One similarity across all of these mobile networks is that they all use
vendor-specific devices to perform the different core network functions. 5G, however,
uses Virtual Network Functions in its core network.
In our project, we used the 5G virtual core network functions from the
OpenAirInterface (OAI) Software Alliance by EURECOM. These virtual core network
functions are open source and written in C++, so configuration and debugging were
more flexible.
We deployed the core network functions using Docker, with a separate Docker
container for each core network function. The Docker containers were then networked
using Docker networking techniques, and IP addresses were assigned to every
container in the range 192.168.70.128/26.

2.1.1 5G core network deployment

2.1.1.1 5G Standalone vs Non-standalone architecture

When it comes to the 5G core network deployment there are two main architectures
available[13].

• Non-Standalone Architecture (NSA)
NSA relies on 4G network facilities to provide more speed and higher data
bandwidth. A 5G-enabled device will connect to a 5G or 4G network depending
on conditions. With NSA 5G, the clue is in the name: it is 5G that cannot
stand on its own in terms of infrastructure. NSA is a 5G radio access
network (RAN) that operates on a legacy 4G LTE core, known as the Evolved
Packet Core (EPC), which manages control plane functions. NSA includes both
a 4G base station (eNodeB) and a 5G base station (gNodeB), but the 4G base
station takes precedence. Because the NR control plane anchors to the EPC,
radio frequency signals forward to the primary 4G base station.

The drawback of NSA 5G, however, is that it cannot deliver certain capabilities
that a pure, unfettered SA 5G network can. For example, NSA does not enable
the low latency that is one of the biggest draws of 5G. For this reason, NSA
is not suitable for URLLC applications such as telerobotics. Another disadvan-
tage of NSA is that it requires more energy to power 5G networks with
4G infrastructure: 5G NR is more energy-efficient than LTE, IEEE reported,
but using two different forms of cellular technology massively increases power
consumption in a network.

• Standalone Architecture (SA)
SA is the true 5G network, where the 5G network has its own dedicated facilities
to provide large speed improvements and minimal network latency (delay).
The 5G SA network is independent of the 4G network. SA 5G networks include
both a 5G RAN and a cloud-native 5G core, something NSA networks lack and
substitute with a 4G core. SA networks can perform essential 5G functions, such
as reducing latency, improving network performance, and centrally controlling
network management functions, because of their independent 5G cores. For
these reasons, a standalone 5G core network is the best candidate for
applications such as telerobotics.

Figure 2.1: Non-standalone Vs. Standalone architecture

2.1.1.2 5G core network deployment options


To deploy the 5G core network, there were a few available options.

Table 2.1: 5G Network implementation available options


Option 1: Available Dialog 5G network at the university (Non-Standalone)
• Already established
• Unable to make significant changes
• Must be on university premises to use
• Not effective for URLLC applications

Option 2: Radio Access Network and core network implementation using Open Air Interface
• Fully virtual
• Changes can be made; the C++ programming language is used
• SDR (Software Defined Radio) devices are used to connect edge devices to the network
• Can be implemented on our personal computers in an Ubuntu/Linux environment

Option 3: Open Air Interface Radio Access Network and 5G standalone core (Cumucore) network available at the university
• Changes can be made to the network
• SDR devices are used to connect edge devices to the network
• Must be on university premises to use

However, because the labs were closed due to the pandemic, the fully virtual option
using a software platform was used to deploy the 5G core network. There were several
platforms for this, such as OpenAirInterface, NextEPC, and free5GC. Among them,
OpenAirInterface was chosen because it is an open-source platform with a large
community. Also, since the OAI 5G network mainly focuses on deploying a standalone
5G core network, for the reasons mentioned previously it is more suitable for our
application.

If the network is deployed using physical devices, the user plane function and
control plane functions are usually implemented in two virtual machines. With
OpenAirInterface we can instead deploy these network functions using Docker
containers, replacing the requirement for virtual machines.

We deployed the network using the second method. The advantage of this Docker
container deployment method is that it supports radio access network simulations. In
our project, we used the following simulators.

• RF simulator

• gnBSim

2.1.1.3 5G core network functions

In the 5G standalone architecture, the 5G core follows a Service-Based Architecture
(SBA), which is the recommended, programmable architecture design for the inter-
connection of 5G network functions in the core and for the exposure of network
capabilities and resources via a service message bus. SBA is aligned with 5G's
service-centric requirements: the cloud-native model for softwarized network functions
(NFs), virtualized deployments of NFs at the edge and core, and their integration with
the OSS/BSS layers.

In SBA, the approach is to adapt, evolve, expose, and develop network capabilities
based on the new generation of services offered in the 5G world. As shown in figure
2.2, control plane network functions are interconnected via the service message bus,
exposing network capabilities within and outside the core network.

Network functions connected via Service Bus in the Control Plane form the Service
Based Architecture (SBA) in the 5G Core Network.

In SBA, Network Function (NF) capabilities are exposed via REST APIs based on
the HTTP/2 protocol. Interconnection between NFs can follow either the Re-
quest/Response model or the Subscribe/Notify model for availing the different 5G
services [14].

Figure 2.2: 5G service based architecture

The following are the main functions used in the 5G standalone core network [15].

• AMF (Access and Mobility Function)


It performs operations like mobility management, registration management, and
connection management, and acts as the single entry point for the UE connec-
tion. Based on the service requested by the consumer, the AMF selects the
respective SMF for managing the user session context. Compared with the 4G
EPC, its functionality resembles that of the MME in the 4G network.

• SMF (Session Management Function)


It performs operations like session management, IP address allocation and man-
agement for the UE, user plane selection, QoS policy enforcement for the control
plane, and service registration/discovery/establishment. Compared with the 4G
EPC, its functionality resembles that of the MME, SGW-C (Serving Gateway
Control Plane), and PGW-C (PDN Gateway Control Plane) in the 4G network.

• UPF (User Plane Function)


It performs operations like maintaining the PDU session, packet routing and
forwarding, packet inspection, policy enforcement for the user plane, and QoS
handling. Compared with the 4G EPC, its functionality resembles that of the
SGW-U (Serving Gateway User Plane function) and PGW-U (PDN Gateway
User Plane function) in the 4G network.

• NRF (Network Repository Function)


It maintains the list of available network function instances and their profiles.
It also performs service registration and discovery so that different network
functions can find each other via APIs. For example, an SMF that is registered
with the NRF becomes discoverable by the AMF when a UE tries to access a
service type served by that SMF. Since network functions are connected via the
service message bus in SBA, any authorized consumer can access the services
offered by registered network functions (producers).

• AUSF (Authentication Server Function)


It allows the AMF to authenticate the UE. Compared with the 4G EPC, its
functionality resembles that of the HSS/AAA server in the 4G network.

• NSSF (Network Slice Selection Function)


It maintains a list of the operator-defined network slice instances. The AMF
authorizes the use of network slices based on the subscription info stored in the
UDM, or it can query the NSSF to authorize access to a network slice based on
the service requirements. The NSSF redirects traffic to the intended network slice.

• UDM (Unified Data Management)


It performs operations like user identification handling, subscription manage-
ment, user authentication, and access authorization for operations like roaming.
Compared with the 4G EPC, its functionality resembles that of the HSS/AAA
server in the 4G network.

• PCF (Policy Control Function)


It supports a policy control framework, applying policy decisions and accessing
subscription information to govern network behavior. Compared with the 4G
EPC, its functionality resembles that of the PCRF in the 4G network.

• NEF (Network Exposure Function)


It exposes services and resources over APIs within and outside the 5G core. 5G
service exposure by the NEF is based on RESTful APIs; with the help of the
NEF, third-party applications can also access 5G services.

• AF (Application Function)
It performs operations like accessing the Network Exposure Function to retrieve
resources, interacting with the PCF for policy control, and exposing services to
end users. Compared with the 4G EPC, its functionality resembles that of the
AF in the 4G network.

2.1.1.4 OAI core network deployment

As mentioned in the previous section, OAI was used to deploy a virtual core network.
Each of the above-mentioned core network functions is available as a Docker image,
and each function has its own repository. The functions are deployed in Docker
containers, and each has a unique IP address that allows communication between
functions. Some repositories related to network slicing, such as NSSF, NWDAF
(Network Data Analytics Function), UDSF (Unstructured Data Storage Function),
and the VPP (Vector Packet Processing) based UPF, are private since they are
still in development. Therefore some functionalities, such as network slicing, are not
achievable until they are published.

Figure 2.3 shows the 5G core network functions we deployed using the OAI architecture.
The following Docker containers were deployed to implement the core network:

• OAI-AMF - 192.168.71.132

• OAI-SMF - 192.168.71.133

• OAI-NRF - 192.168.71.130

• OAI-UPF (OAI-SPGWU) - 192.168.71.134, 192.168.72.132

• OAI-EXTdn - 192.168.71.135

• MySQL Server - 192.168.71.131
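As a quick sanity check after the containers come up, each function's IP can be pinged from the host. The following Python sketch is illustrative only (it is not part of the OAI tooling) and assumes the IP assignments listed above:

```python
import subprocess

# Core network function containers and their assigned IPs (from the list above)
CORE_FUNCTIONS = {
    "OAI-AMF": "192.168.71.132",
    "OAI-SMF": "192.168.71.133",
    "OAI-NRF": "192.168.71.130",
    "OAI-UPF": "192.168.71.134",
    "OAI-EXTdn": "192.168.71.135",
    "MySQL": "192.168.71.131",
}

def is_reachable(ip: str) -> bool:
    """Send a single ICMP echo request and report whether a reply arrived."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

if __name__ == "__main__":
    for name, ip in CORE_FUNCTIONS.items():
        status = "UP" if is_reachable(ip) else "DOWN"
        print(f"{name:10s} {ip:16s} {status}")
```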

Figure 2.3: OpenAirInterface 5G core network architecture

Usually, in a 5G SA network, there is a database that integrates with the AMF
function, but in the OAI architecture this is done separately, with a MySQL database
running as its own Docker container. Also, the IP pool for UEs in the SMF function is
configured here, between 12.1.1.2 and 12.1.1.128. Ext-dn represents an external data
network for simulation purposes; instead, we can also connect to the internet.

2.1.2 5G Radio Access Network Deployment
A Radio Access Network is essential in any mobile network; it allows user equipment
to connect to the network through a radio interface. In our project, we have two pieces
of user equipment: one is the laptop or 5G-enabled mobile device that runs the
teleoperation application, and the other is the robot itself. There were two available
options for implementing the radio access network:

1. Virtual Implementation using RAN Simulators

2. Implementation with USRP (Universal Software Radio Peripheral) devices

2.1.2.1 Virtual Implementation using RAN Simulators


OpenAirInterface not only provides virtual core network functions but also provides
code that acts as a virtual gNodeB and virtual 5G user equipment, reproducing the
behavior of real gNodeBs and commercial off-the-shelf 5G user equipment.

In a real cellular network, a base station and a UE are connected through a radio
interface. In our virtual setup, however, we used radio interface simulators such
as gnbsim and RFSimulator to simulate the Radio Access Network. These
simulators are in the early stages of development and support a very limited
number of features. Currently, they support only establishing a virtual connection
between a virtual gNB and a virtual 5G New Radio user equipment; they do not
support advanced features like propagation models.

Figure 2.4: Connecting two virtual UEs to the core network through virtual gNodeBs

Connecting and Starting the gNodeB

As shown in the figure above, we host the core network functions on one computer
and connect two other computers using a LAN. A pair consisting of a virtual
gNodeB and a virtual UE is hosted on each of these two computers. The following
configurations are needed for proper connectivity.

When configuring the gNodeB, we have to match the following parameters in the
Access and Mobility Function (AMF) with the gNodeB configuration file, as shown in
figure 2.5:

• TAC (Tracking Area Code)

• gNodeB ID

• MCC (Mobile Country Code)

• MNC (Mobile Network Code)

Figure 2.5: gNodeB configuration file

In addition to that, we have to put the IP address of the host machine on which
the virtual gNodeB is deployed in the relevant section of the gNodeB configuration
file, as shown in figure 2.6.

Figure 2.6: Network interfaces in gNodeB

After successfully configuring the gNodeB configuration file, we can execute the
following command to start the gNodeB.

Figure 2.7: Command to turn on gNodeB

Connecting and Starting the UE

As already mentioned in the previous section, there is a MySQL database in the
core network that contains data related to user equipment. A new entry must be
added to this database every time a new UE is connected to the core network. Mainly
the parameters listed below are considered when adding a new entry. The values in
the UE configuration file should exactly match the values in the database entry.

• IMSI (International Mobile Subscriber Identity)

• key

• opc

Figure 2.8: UE Configuration file for new UE

Figure 2.9: Data Base Entry related to the new UE

After successfully configuring the UE, we can use the following command to start
the virtual UE in RFSimulator mode.

Figure 2.10: Command to turn on virtual UEs using RFsimulator mode

The virtual UE automatically attaches to the gNodeB when RFSimulator mode
is on. The Access and Mobility Function (AMF) of the core network identifies the UE
as soon as it attaches to the gNodeB. Therefore, by analyzing the log files of the AMF,
we can confirm that the UE is connected to the core network through the gNodeB.

We deployed two pairs of virtual gNodeBs and virtual UEs, with both gNodeBs
connected to the core network as shown in the architecture diagram. We used this
setup for testing and for taking network measurements. A picture of the real setup is
shown below in figure 2.11.

Figure 2.11: Virtual UE and gNodeB setup

2.1.2.2 Implementation with USRP Devices and 5G modules

As mentioned earlier, there are two ends: one end is the robot and the other is the
teleoperation application. We had only one USRP B210 device and one SimCom
5G module available at the Dialog Research Lab. These are expensive devices, and
approval and special clearances are needed to import them to Sri Lanka as they
are 5G devices. Our initial plan was therefore to deploy two gNodeBs, where
one gNodeB is deployed using the USRP B210 and the other using a simulator.
The robot then connects to the USRP gNodeB through the SimCom 5G module,
while the application runs on a laptop connected to the simulated gNodeB. The
following diagram shows the architecture.

Figure 2.12: Architecture with USRP device

We successfully deployed the OpenAirInterface gNodeB on the USRP B210
device with UHD version 3.15 LTS, and it successfully transmits 5G signals. However,
we faced compatibility issues between the OAI gNB and the SimCom 5G module. We
later found out that the USRP implementation of the gNodeB works well with 5G
communication modules from Quectel, but we were unable to import one because it
is very expensive and would not have been delivered within the project time frame.

Figure 2.13: Setup with USRP module

2.2 Teleoperation Application

We designed a web interface to view the video feed from the robot and to control
the robot, built using the WebRTC framework. WebRTC stands for Web Real-Time
Communication. In WebRTC, no servers are involved in transferring real-time data
from one client to another: the connections are peer-to-peer, which makes sending
data very simple, and all of this is done through WebRTC APIs that are available on
almost all platforms. Strictly speaking, saying that no servers are required is not
completely accurate: servers are required initially for the client computers to get
connected, but once the connection is established the clients can communicate with
each other without the need for a server.

Figure 2.14: WebRTC Architecture

What happens is that the operator end sends some data about itself to the signal-
ing server, and the signaling server forwards that data to the robot end, where the
client stores it. The robot-side client then sends data about itself to the server, and
the server forwards that data to the operator side. At this point both clients know
each other's media configurations, but they still do not know how to connect: for
that they have to exchange their network information. The network configuration
data that must be transferred for the connection to happen is called ICE candidates.
ICE candidates are generated with the help of STUN and TURN servers. To obtain
ICE candidates, a client provides the URLs of the STUN and TURN servers to the
WebRTC API; as soon as the client creates an offer, the WebRTC API starts
gathering ICE candidates from the STUN and TURN servers. The client's job is then
to send each ICE candidate to the other client, so it sends the candidate to the
server first, and the server forwards it to the other client. Once both clients have
network information about each other, the connection is established between them,
and they can exchange data without the need for a server.
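Although our clients are browser-based and use the JavaScript WebRTC API, the same offer/answer flow can be sketched in Python with the third-party aiortc library. The sketch below is illustrative only: send_to_peer and receive_from_peer are hypothetical stand-ins for the signaling channel, and the STUN server URL is an assumed public one.

```python
import asyncio
from aiortc import (RTCConfiguration, RTCIceServer,
                    RTCPeerConnection, RTCSessionDescription)

async def start_call(send_to_peer, receive_from_peer):
    """Offerer side of the WebRTC handshake described above.

    send_to_peer/receive_from_peer are placeholders for the signaling
    channel (in our system, the server hosted on Heroku).
    """
    # The STUN/TURN server URLs are handed to the WebRTC API up front
    config = RTCConfiguration(
        iceServers=[RTCIceServer(urls="stun:stun.l.google.com:19302")]
    )
    pc = RTCPeerConnection(configuration=config)

    channel = pc.createDataChannel("control")  # channel for robot control commands

    # Create the offer; setting the local description triggers ICE gathering
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)

    # Send our description (media/network info) to the other client via signaling
    await send_to_peer({"sdp": pc.localDescription.sdp,
                        "type": pc.localDescription.type})

    # Receive the answer from the robot side and complete the handshake
    answer = await receive_from_peer()
    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=answer["sdp"], type=answer["type"])
    )
    return pc, channel
```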

We hosted our signaling server on Heroku. Heroku is a container-based cloud Plat-
form as a Service (PaaS); developers use Heroku to deploy, manage, and scale modern
apps.

Figure 2.15: Heroku Server
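Conceptually, the signaling server only relays SDP descriptions and ICE candidates between the two peers. Our Heroku deployment differs in its details, but a minimal relay can be sketched in Python with the third-party websockets library (recent versions; room management and error handling omitted):

```python
import asyncio
import websockets

# The two connected peers (operator and robot); a real server would track rooms
peers = set()

async def relay(ws):
    """Forward every signaling message (SDP offers/answers, ICE candidates)
    from one peer to the other."""
    peers.add(ws)
    try:
        async for message in ws:
            for other in peers:
                if other is not ws:
                    await other.send(message)
    finally:
        peers.discard(ws)

async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())
```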

We also set up our own TURN server for the application, because the existing
public TURN servers did not work properly: they introduced a significant delay when
establishing the connection. We used the DigitalOcean cloud platform to host our
TURN server.

Figure 2.16: TURN Server

As shown in Figure 2.14, two threads are used at the robot end: one for con-
trolling the actuators and the other for autonomous braking. This is explained
in detail in the robot design section.

2.2.1 Operator Interface


For our web application we used JavaScript as the programming language
and Bootstrap 4 for user interface design. The following figure shows the
robot operator interface that we designed to view the video feed from the robot and
control the robot.
The window in the top left shows a video stream of the person who controls the
robot. This window can be minimized according to the user's preference, and the
user can also mute the audio and stop the video. The larger window on
the right side shows the video stream from the robot. The web application also
defines control commands: by pressing the corresponding key on the keyboard,
the operator can control the movement of the robot.

Figure 2.17: Operator Interface

2.3 Robot Design

To demonstrate the teleoperation application of 5G technology, a robot had to
be designed. Due to the situation in the country during the initial phase of
the project, the plan was to demonstrate the robotic part using a robot simulation
platform. After getting access to the lab facilities and necessary services after the mid
evaluations, designing a physical robot was proposed, and the relevant work was
subsequently carried out. To fully demonstrate the application, a life-sized robot
is required, but considering the budget and time restrictions, a miniature version
of the robot was designed instead.

Figure 2.18: Original Conceptual Design of the Robot

2.3.1 Objective of the Robot Design


The robot was designed to demonstrate the telerobotic application of 5G technology.
It is proposed to design a robot to operate in a crowded restaurant as a waiter robot.
To demonstrate the application, the robot has to have some mandatory capabilities.

1. Basic maneuvering capability

2. Tele operability

3. Real-time Audio/Video communication

4. Autonomous Collision avoidance

2.3.1.1 Basic Maneuvering Capability


This feature enables the robot operator to navigate the restaurant to achieve given
tasks, usually delivering food from the kitchen to the customers. A simple differential
drive mechanism is used for maneuvering due to its simplicity and efficiency. The
maneuvering commands are given through the 5G network by the robot operator at
a remote location.
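For reference, the sketch below shows how a differential drive maps a commanded linear and angular velocity to the two wheel speeds; the geometry constants and function names are assumptions for illustration, not values from the robot's actual code.

```python
# Differential drive kinematics: convert a commanded linear velocity v (m/s)
# and angular velocity w (rad/s) into left/right wheel angular speeds.

WHEEL_RADIUS = 0.045   # meters (assumed value for illustration)
WHEEL_BASE = 0.20      # distance between the two wheels, meters (assumed)

def wheel_speeds(v: float, w: float) -> tuple[float, float]:
    """Return (left, right) wheel angular velocities in rad/s."""
    v_left = v - (w * WHEEL_BASE / 2.0)    # linear speed of the left wheel
    v_right = v + (w * WHEEL_BASE / 2.0)   # linear speed of the right wheel
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Example: drive forward at 0.3 m/s while turning left at 0.5 rad/s
left, right = wheel_speeds(0.3, 0.5)
```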

2.3.1.2 Real-Time Audio/Video Communication


Real-time audio and video communication is needed for continuous interaction
between the robot end and the operator end. The audio and video feedback from
the robot gives the robot operator a closed-loop control mechanism, while the
audio and video link from the operator side to the robot side is used to communicate
with the customers. A web camera is used to capture the audio and video feed at
the robot end.

2.3.1.3 Collision Avoidance


The application of this robot is to work as a waiter robot in a restaurant. A
restaurant can usually be considered a crowded place consisting of various static and
dynamic obstacles. To avoid hazards and to deliver food with care, the robot
has to have an autonomous collision avoidance mechanism. A distance measuring
sensor is used to measure the distances to obstacles. The initial plan was to use a
LiDAR sensor or a Time-of-Flight sensor, but due to lack of component availability
and for economic reasons, an ultrasonic distance measurement sensor was used.

2.3.1.4 Robot Design process


The robot design process started with brainstorming on how to build the robot,
which components to use, and why. After considering various options, the following
specifications were finalized.

Table 2.2: Component Selection

Type | Component | Reason to Choose
Robot controller + communication module | Raspberry Pi 4B | Performance and availability
Motors | Pololu 37D metal gear motors with encoders | Performance and availability
Motor driver | Dangaya 2.0 motor driver | Compatibility and availability
Distance measurement sensor | HC-SR04 ultrasonic sensor | Low cost
Camera and mic | Generic web cam | Low cost and availability
Power source | 11.1V 2200mAh Li-Po battery | Availability
Voltage regulators | Buck-boost and buck converters | Availability
Backup communication module | SimCom 5G UE module | Availability
Backup power source | Mi 10000mAh power bank | Availability

2.3.2 Components Used


2.3.2.1 Raspberry Pi 4B
The Raspberry Pi 4B is a single-board computer widely used in robotics
communities. Its technical specifications are as follows.

• Broadcom BCM2711, Quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5GHz

• 4GB LPDDR4-3200 SDRAM

• 2.4 GHz and 5.0 GHz IEEE 802.11ac wireless, Bluetooth 5.0, BLE

• Gigabit Ethernet

• 2 USB 3.0 ports; 2 USB 2.0 ports.

• Raspberry Pi standard 40 pin GPIO header (fully backward compatible with


previous boards)

• 2 × micro-HDMI ports (up to 4kp60 supported)

• 2-lane MIPI DSI display port

• 2-lane MIPI CSI camera port

• 4-pole stereo audio and composite video port

• H.265 (4kp60 decode), H.264 (1080p60 decode, 1080p30 encode)

• Micro-SD card slot for loading operating system and data storage

• 5V DC via USB-C connector (minimum 3A*)

• 5V DC via GPIO header (minimum 3A*) [16]

31
Figure 2.19: Raspberry Pi 4B Single Board Computer

2.3.2.2 Pololu 37D Metal Gear Motors


The Pololu 37D metal gear motor is one of the most popular gear motors in
robotics communities. It requires an input voltage of 12V and has a stall
current of 5.5A. The Pololu 37D comes with a quadrature encoder, based on Hall
sensors, that provides 400 counts per revolution.

Figure 2.20: Pololu 37D Metal Gear Motor with Encoders

32
2.3.2.3 Dangaya 2.0 Motor Driver
Dangaya 2.0, officially Aptinex Dangaya 2.0, is a VNH5019-based dual motor driver.
The operating voltage of this module ranges from 5.5V to 24V, and it provides a
continuous current of 12A per channel. This motor driver is compatible with the
Pololu 37D motors, which demand a 12V input.

Figure 2.21: Aptinex Dangaya 2.0 Dual Motor Driver

2.3.2.4 HC-SR-04 Ultrasonic Sensor


This is a simple and compact module that can be used to detect obstacles and
calculate the distance to them using a simple mechanism. The module has two
ultrasonic transducers: one emits the ultrasonic wave and the other detects the
reflected wave. Using this module, we can calculate the distance to an obstacle
ahead with the following equation.

Distance = (TimeDifference × SpeedOfSound) / 2
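On the robot, this measurement runs on the Raspberry Pi. The following is a minimal sketch using the RPi.GPIO library; the pin assignments are assumptions for illustration, not the pins used on our PCB.

```python
import time
import RPi.GPIO as GPIO

TRIG = 23  # trigger pin (BCM numbering; assumed for illustration)
ECHO = 24  # echo pin (assumed for illustration)

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def measure_distance_cm() -> float:
    """Fire one ultrasonic pulse and apply Distance = (TimeDifference x SpeedOfSound) / 2."""
    # A 10 microsecond trigger pulse starts a measurement
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    start = end = time.time()
    while GPIO.input(ECHO) == 0:   # wait for the echo pulse to start
        start = time.time()
    while GPIO.input(ECHO) == 1:   # wait for the echo pulse to end
        end = time.time()

    elapsed = end - start          # round-trip time of the sound wave
    return (elapsed * 34300) / 2   # speed of sound is roughly 34300 cm/s
```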

33
Figure 2.22: HC-SR04 Ultrasonic Sensor

Figure 2.23: Distance Measurement Mechanism of Ultrasonic Sensors

The advantage of this module is that it is simple to set up and use. Further-
more, it does not need additional hardware or processing overhead to operate.
Its main disadvantage is that its optimal distance range is limited, unlike ToF
and LiDAR sensors.

34
Figure 2.24: Effective Distance Measurement Range for Ultrasonic Sensors

2.3.2.5 Generic Web Camera


This is a 720p web camera with a built-in microphone that can easily be found
on the market. The advantage of using this webcam is that a dedicated
microphone is not needed to capture the audio feed.

Figure 2.25: Web Camera

35
2.3.2.6 Voltage regulators
In this robot design, two types of voltage regulating modules were used: a 120W
buck-boost converter and two XL4015 buck converters. The buck converters were
used to step the voltage down to operable levels; one provides a 5V output and the
other a 12V output. The motors used in the robot demand a 12V input voltage,
even though the output voltage of the battery is only 11.1V. To address this, a
120W buck-boost converter was used: with the help of this module, the battery's
output voltage is boosted up to 15V and then stepped back down to 12V, the
suitable voltage for the motors.

Figure 2.26: XL4015 Buck Converter

Figure 2.27: 120W Buck Boost Converter

2.3.2.7 SimCom 5G UE Module
This module was used as a backup communication module for the robot. Using this module, the robot can be connected directly to a base station. The SimCom 5G module is configured through AT commands, which can be used to connect the device to a 5G base station when one is available. The following figure shows a serial monitor communicating with the SimCom module using AT commands. The first command, AT+COPS=?, lists the networks the module can connect to; the last integer parameter of each entry indicates the network type.
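
As an illustration, the same query can also be issued programmatically. The sketch below uses pyserial; the serial device path and baud rate are assumptions.

# Sketch: querying available networks from the SimCom module over a
# serial port. The device path and baud rate are assumed values.
import serial

with serial.Serial("/dev/ttyUSB2", 115200, timeout=60) as modem:
    modem.write(b"AT+COPS=?\r\n")          # list available networks
    # The network scan can take a while; read until the final "OK"
    response = modem.read_until(b"OK\r\n")
    print(response.decode(errors="replace"))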

Figure 2.28: SimCom 5G UE Module

Figure 2.29: Communicating with the SimCom 5G UE Module via AT Commands through a Serial Monitor

Table 2.3: Network Types


Parameter | Network Type
0 | GSM (2G)
1 | GSM Compact
2 | UTRAN (3G)
6 | UTRAN_HSDPA_HSUPA (3G)
7 | EUTRAN (4G LTE)
8 | EC_GSM_IOT
9 | EUTRAN_NB_S1
11 | NR_5GCN (5G SA)
12 | NGRAN (5G NSA)
13 | EUTRA_NR

2.3.3 Building the Robot


2.3.3.1 SolidWorks Design
Before the robot was built, a CAD model of the robot was created in SolidWorks to guide the building process. Building the robot was quite straightforward once the CAD design was complete.

Figure 2.30: CAD Drawing of the Robot

2.3.3.2 PCB Design


To reduce the complexity of the wiring, a PCB was introduced to the robot. This PCB mainly handles power and signal regulation and routing. It was designed in Altium as a two-layer board; the schematic and the layout are attached as appendices.

The final robot design for the demonstration of the application is as follows. The robot operates with a multi-threaded setup: one thread handles control commands and the other handles autonomous braking. When a control command JSON is received by the web app, it is forwarded to the control command thread through a Flask app. The other thread continuously measures the distance to any obstacle ahead and applies the brakes if an obstacle is detected within 20cm, as sketched below.
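
A minimal sketch of this two-thread setup is given below, assuming a Flask route named /control and stub functions in place of the motor driver and distance sensor; only the control command JSON flow and the 20cm threshold come from the description above.

# Sketch of the two-thread setup: a Flask thread receiving the control
# command JSON and a braking thread polling the distance sensor. The
# /control route and the stubs are illustrative assumptions.
import threading
import time
from flask import Flask, request

app = Flask(__name__)
obstacle_ahead = threading.Event()

def drive(command):
    print("driving:", command)      # stub: would command the motor driver

def brake():
    print("braking")                # stub: would stop the motors

def read_distance_cm():
    return 100.0                    # stub: would read the ultrasonic sensor

@app.route("/control", methods=["POST"])
def control():
    command = request.get_json()    # control command JSON from the web app
    if not obstacle_ahead.is_set():
        drive(command)
    return "", 204

def autonomous_braking():
    # Second thread: brake whenever an obstacle is closer than 20 cm
    while True:
        if read_distance_cm() < 20:
            obstacle_ahead.set()
            brake()
        else:
            obstacle_ahead.clear()
        time.sleep(0.05)

threading.Thread(target=autonomous_braking, daemon=True).start()
app.run(host="0.0.0.0", port=5000)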

Figure 2.31: Final Design of the Robot

Figure 2.32: Sample Control Command JSON Object

2.3.4 Suggestions for Further Improvements
2.3.4.1 Build a Life-sized Robot
By building a life-sized robot, it can be tested in an actual restaurant environment rather than in a small arena. This will help identify aspects that can be further improved in a real operating environment.

2.3.4.2 Use More Sophisticated Hardware


By using more sophisticated hardware for the robot, we can reduce the additional latencies introduced by hardware limitations. For example, by using a more capable computer rather than a Raspberry Pi SBC, latencies can be minimized while providing higher-quality audio/video feeds. Furthermore, by introducing a LiDAR sensor for distance measurement, the obstacle detection and autonomous braking functionalities can be made more reliable than in the current setup. By adding a display to the robot, the overall user experience can be improved considerably, since the customer can actively communicate with the robot operator at the remote location.

2.3.4.3 Test the Robot on a 5G SA Network


The currently available implementation of 5G technology in Sri Lanka is a 5G NSA deployment that uses the same core network as the 4G network. To experience the real capabilities of 5G, a 5G SA network is required. The performance of the overall project and the customer experience could be further improved by using a 5G SA network instead of an NSA network.

2.3.5 Alternative Methods


At the initial phase of the project, the plan was to demonstrate the robotic part using a robot simulator. Due to the Covid-19 pandemic at the time, the most feasible solution was to simulate the functionality of the robot without implementing a physical robot. The chosen use case was a waiter robot for a restaurant. Initially, two candidates were considered for the simulation platform:

• Webots

• Gazebo

The pros and cons of each platform can be summarised as follows.

Table 2.4: Webots vs Gazebo


Simulator | Pros | Cons
Webots | Faster learning curve; larger community; simple design | Few plugin options
Gazebo | Larger community; bigger pool of plugins | Slower learning curve

After considering the pros and cons of Webots and Gazebo, the open-source simulator Webots was chosen to simulate the robot and the restaurant environment. A simple yet complete restaurant environment was designed from scratch in Webots; it has a few dining tables, chairs, a dining area, and a pantry area. To demonstrate the functionality of the robot, a simple robot design was used. The robot consists of two driving wheels, one passive wheel, a camera, and a platform to carry the food. The robot can be controlled using a keyboard, and at this stage it displays the camera feed in a new window. Later, this is to be integrated into the web app, where the video feed and robot controls are to be displayed.

Figure 2.33: Simulation Environment

The controller of the robot is implemented in Python; the NumPy and OpenCV libraries are used to process the camera output. This is to be integrated into the web app using WebRTC in the later stages.

Figure 2.34: Robot Design

The main drawback of this method was that the performance of the robot simulation depended heavily on the hardware platform used to run it; the bottleneck was the performance of the laptop on which the simulation was run.
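
The following is a minimal sketch of such a Webots controller; the device names ("camera", "left wheel motor", "right wheel motor") and the speed constant are assumptions, while the keyboard control and the NumPy/OpenCV camera processing follow the description above.

# Sketch of a keyboard-driven Webots controller with an OpenCV camera view.
import cv2
import numpy as np
from controller import Robot, Keyboard

robot = Robot()
timestep = int(robot.getBasicTimeStep())

keyboard = robot.getKeyboard()
keyboard.enable(timestep)

camera = robot.getDevice("camera")      # assumed device name
camera.enable(timestep)

left = robot.getDevice("left wheel motor")    # assumed device names
right = robot.getDevice("right wheel motor")
for motor in (left, right):
    motor.setPosition(float("inf"))     # velocity-control mode
    motor.setVelocity(0.0)

MAX_SPEED = 5.0                         # illustrative value, rad/s

while robot.step(timestep) != -1:
    key = keyboard.getKey()
    if key == Keyboard.UP:
        left.setVelocity(MAX_SPEED); right.setVelocity(MAX_SPEED)
    elif key == Keyboard.DOWN:
        left.setVelocity(-MAX_SPEED); right.setVelocity(-MAX_SPEED)
    elif key == Keyboard.LEFT:
        left.setVelocity(-MAX_SPEED); right.setVelocity(MAX_SPEED)
    elif key == Keyboard.RIGHT:
        left.setVelocity(MAX_SPEED); right.setVelocity(-MAX_SPEED)
    else:
        left.setVelocity(0.0); right.setVelocity(0.0)

    # Webots returns BGRA bytes; reshape with NumPy and show via OpenCV
    frame = np.frombuffer(camera.getImage(), np.uint8).reshape(
        (camera.getHeight(), camera.getWidth(), 4))
    cv2.imshow("Camera Feed", frame)
    cv2.waitKey(1)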

Figure 2.35: Camera Feed

3 Results
In this chapter, we discuss the intermediate results, the analysis of requirements, and the limitations of the implementation.
3.1 5G Network Deployment

We proceeded with the fully virtual setup discussed in a previous section to take network measurements and perform other tests. We used well-known network measurement tools such as iperf, iperf3, and ping to take the values, and Wireshark was used to analyse the packets. We also used the log file of the AMF (Access and Mobility Management Function) for debugging purposes; it provides a high-level understanding of and a live status report on the network.

3.1.1 Ping
3.1.1.1 Pinging from UE to EXT-DN
We used the ping measurement tool simultaneously in both UEs and pinged the EXT-DN block (192.168.72.135), which represents the external data network. An average ping value of 38ms was reported in the first UE, and the second UE reported an average ping value of 44ms. Screenshots of the UEs and a graph of the ping values are shown below.

Figure 3.1: Ping from UE1 to Ext-dn

Figure 3.2: Ping from UE2 to Ext-dn

Figure 3.3: Ping values

3.1.1.2 Pinging between UEs
The following figure shows a screenshot of the AMF logs taken when two UEs are connected to the core network through two gNodeBs. It confirms that the proposed architecture has been deployed.

Figure 3.4: AMF logs

We configured the Session Management Function (SMF) of our core network to assign IPs to the UEs in the range 12.1.1.2-12.1.1.127. Therefore, when we connected two virtual UEs to the core network, one UE was assigned the IP address 12.1.1.2 and the other 12.1.1.3. To measure the connectivity and round-trip time between the UEs, we pinged from 12.1.1.3 to 12.1.1.2. The following figure shows the execution; we recorded an average round-trip time of 738ms.

Figure 3.5: Ping values

We used Wireshark to capture packets in the core network while pinging from
one UE to another. The following screenshot of Wireshark shows that packets from
12.1.1.3 to 12.1.1.2 traverse through the 5G core network. It indicates that ping
requests and replies both go through the core network. In addition to that, commu-
nication between different core network functions can also be seen.

Figure 3.6: Wireshark Analysis

3.1.2 Iperf Test


We measured the bandwidth of the link between one UE and the core network. Iperf was used to take the measurement: an iperf server was deployed in one UE and an iperf client on one of the core network functions (EXT-DN). The results are shown below.
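
For reference, such a test can be scripted as sketched below, assuming iperf3 is installed and the server side has already been started on the UE with "iperf3 -s"; the UE address shown is one of the addresses from the SMF pool described earlier.

# Sketch: driving the iperf3 client from Python. The other end (here a UE)
# must already be running "iperf3 -s".
import subprocess

UE_ADDR = "12.1.1.2"  # a UE address assigned by the SMF

# TCP throughput test for 10 seconds
subprocess.run(["iperf3", "-c", UE_ADDR, "-t", "10"], check=True)

# UDP test at an offered load of 10 Mbit/s
subprocess.run(["iperf3", "-c", UE_ADDR, "-u", "-b", "10M"], check=True)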

Figure 3.7: Throughput values

Table 3.1: Iperf test results


Scenario | Average ping value (ms) | Maximum TCP bandwidth (Mbps) | Maximum UDP bandwidth (Mbps)
Core and RAN on separate machines | 554.2 | 4.02 | 4.02

These values depend heavily on the performance of the hardware. The OpenAirInterface (OAI) 5G core network also has its own limitations; the official website of the software platform indicates that the current version of OAI does not fully support the URLLC use case. However, these values can be further improved with newer versions of OAI and more capable hardware.

3.2 Teleoperation Application

The web application and the robot can be connected through a LAN or a mobile communication network; we tested them over a LAN, a 4G network, and a 5G NSA network. We measured the latency of transferring control commands from the operator end to the robot end by controlling the robot over these different networks, using our own measurement method. The following diagram shows how we measured latency.

Figure 3.8: Latency measuring method

A simple method to measure the latency of the data channel was implemented. The data channel is used to transmit the control commands from the operator end to the robot end. First, the time is logged (say t1) at the operator end when the control command is issued. When the control command is received at the robot end, it is echoed back to the operator end. When the control command arrives back at the operator end, the time is logged again (say t2). The transmission delay (say td) is then calculated as,
td = (t2 − t1) / 2

This calculation was carried out based on two assumptions.

1. The transmission delay is the same in both directions.

2. The time taken to echo the control signal back at the robot end is negligible compared to the transmission delay.
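
A minimal sketch of this echo-based measurement is shown below, using a plain UDP socket in place of the project's WebRTC data channel; the robot address, port, and payload are illustrative.

# Sketch: echo-based one-way latency estimate over UDP. The robot end
# would first run a trivial echo server, e.g.:
#   s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM); s.bind(("", 9000))
#   while True:
#       data, addr = s.recvfrom(1024); s.sendto(data, addr)
import socket
import time

ROBOT_ADDR = ("192.168.1.50", 9000)   # assumed robot-end address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(2.0)

t1 = time.time()
sock.sendto(b'{"direction": "forward"}', ROBOT_ADDR)  # control command JSON
sock.recvfrom(1024)                                   # echoed back by the robot
t2 = time.time()

td = (t2 - t1) / 2    # one-way delay, under assumptions 1 and 2 above
print(f"Estimated one-way latency: {td * 1000:.1f} ms")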

Using this method, we measured the delay values for the different networks. As mentioned earlier, these are the delays for transferring control commands from the operator end to the robot end; the latency of receiving the video feed from the robot differs slightly from these values. The following graph shows a latency comparison between the different networks.

Figure 3.9: Latency comparison between different networks

The graph indicates that the highest latency is observed on the 4G network, with an average delay of around 42ms. The lowest latency is obtained on the local area network, with an average value of 6ms, while the 5G NSA network shows an average latency of around 24ms. Due to the hardware performance of our computers and the signal quality of the 5G network, this 5G delay is higher than expected. If these challenges can be overcome, we expect to achieve a latency lower than 20ms.

4 Discussion and Conclusion
The Tele Robotics Through 5G project was meant to explore the possibilities of using 5G technology to operate a robot at a remote location. The main goal of the project is to demonstrate the Ultra-Reliable Low Latency Communication (URLLC) use case of 5G mobile networks. To this end, our group developed a miniature robot that can be controlled through any mobile network (from 2G to 5G), a web application to remotely operate the robot, and a virtual 5G standalone core network.
Our initial strategy was to connect the robot to the virtual 5G standalone core network using USRP devices. However, this attempt failed due to compatibility issues, as discussed in the section above. Therefore, we connected our robot and the web application through the non-standalone 5G network by Dialog, which is available on the university premises. We were able to manipulate the robot remotely over this 5G non-standalone network with significantly lower delay values, and the high bandwidth provided smooth video feedback from the robot to the operator. However, the values we obtained from the 5G non-standalone network are not good enough to demonstrate the URLLC use case of 5G, because URLLC is about achieving latency values as low as 1ms.
Although we could not connect the robot to the standalone 5G core network we developed ourselves, we were able to take network measurements: we measured the round-trip time and bandwidth when two user equipments communicate with each other through our 5G standalone core network. We successfully established the connection between the two UEs after days of learning about 5G core network functions, and we gained a significant understanding of how the core network functions should be configured and what each of them does.
Finally, we would like to conclude that the project is a success. The project covered three quite different areas of technology: building a robot, building a web application, and building an end-to-end virtual 5G standalone network, all within a period of one year. The project can be improved in the coming years as well; from our point of view, future work should focus on connecting the robot and the application through the 5G standalone core network using USRP devices.

References
[1] L. Bertinetto, J. Valmadre, J. F. Henriques, A. Vedaldi, and P. H. S. Torr,
“Fully-convolutional siamese networks for object tracking,” 2016.

[2] J.-R. Ohm, G. J. Sullivan, H. Schwarz, T. K. Tan, and T. Wiegand, “Comparison of the coding efficiency of video coding standards—including High Efficiency Video Coding (HEVC),” IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 12, pp. 1669–1684, 2012.

[3] P. Salva-Garcia, J. M. Alcaraz-Calero, Q. Wang, M. Barros, and A. Gavras, “Real-time video adaptation in virtualised 5G networks,” in 2019 IEEE 44th Conference on Local Computer Networks (LCN), 2019, pp. 214–217.

[4] P. A. Chou, A. E. Mohr, A. Wang, and S. Mehrotra, “Error control for receiver-
driven layered multicast of audio and video,” IEEE Transactions on multimedia,
vol. 3, no. 1, pp. 108–122, 2001.

[5] D. Wu, Y. T. Hou, and Y.-Q. Zhang, “Transporting real-time video over the
internet: Challenges and approaches,” Proceedings of the IEEE, vol. 88, no. 12,
pp. 1855–1877, 2000.

[6] A. K. Katsaggelos, Y. Eisenberg, F. Zhai, R. Berry, and T. N. Pappas, “Advances in efficient resource allocation for packet-based real-time video transmission,” Proceedings of the IEEE, vol. 93, no. 1, pp. 135–147, 2005.

[7] Z. Ren, M. Liu, C. Ye, and H. Shao, “The real time video transmission system based on H.264,” in 2009 International Conference on Web Information Systems and Mining. IEEE, 2009, pp. 270–274.

[8] H. Holma and A. Toskala, LTE for UMTS: OFDMA and SC-FDMA based radio
access. John Wiley & Sons, 2009.

[9] A. Avizienis, J.-C. Laprie, B. Randell, and C. Landwehr, “Basic concepts and tax-
onomy of dependable and secure computing,” IEEE transactions on dependable
and secure computing, vol. 1, no. 1, pp. 11–33, 2004.

[10] Z. Shi, H. Zou, M. Rank, L. Chen, S. Hirche, and H. J. Muller, “Effects of packet
loss and latency on the temporal discrimination of visual-haptic events,” IEEE
Transactions on Haptics, vol. 3, no. 1, pp. 28–36, 2009.

[11] A. Kostopoulos, G. Agapiou, F.-C. Kuo, K. Pentikousis, A. Cipriano, D. Panaitopol, D. Marandin, K. Kowalik, K. Alexandris, C.-Y. Chang et al., “Scenarios for 5G networks,” in 2016 23rd International Conference on Telecommunications (ICT), 2017.

[12] M. Eid, J. Cha, and A. El Saddik, “Admux: An adaptive multiplexer for haptic–
audio–visual data communication,” IEEE Transactions on Instrumentation and
Measurement, vol. 60, no. 1, pp. 21–31, 2010.

[13] R. Bhardwaj, “What’s happening with standalone 5G?” https://www.techtarget.com/searchnetworking/tip/Whats-happening-with-standalone-5G, 2021, [accessed 25-May-2022].

[14] arvindpdmn, “5G Service-Based Architecture,” https://devopedia.org/5g-service-based-architecture, 2021, [accessed 25-May-2022].

[15] R. Pathak, “A beginner’s guide for 5G core network architecture,” https://www.rajarshipathak.com/2020/01/beginners-guide-for-5g-core-network-architecture.html, 2021, [accessed 25-May-2022].

[16] “Raspberry Pi Official Web Site.”

Appendix

Figure 4.1: PCB Schematic

Figure 4.2: PCB Layout

