January, 2022
Approval of the Department of Electronic & Telecommunication
Engineering
......................................
Head, Department of Electronic &
Telecommunication Engineering
This is to certify that I/we have read this project and that in my/our opinion it is
fully adequate, in scope and quality, as an Undergraduate Graduation Project.
Signature: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Date: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
i
Declaration
This declaration is made on June 22, 2022.
................ ......................................
Date Abeysundara A.C (170008X)
......................................
Jalath H.P (170244P)
......................................
Jayasinghe J.A.M.D (170263X)
......................................
Opatha O.W.H.P (170416V)
ii
Declaration by Supervisor
I/We have supervised and accepted this dissertation for the submission of the
degree.
................................... ...........................
Dr. Kasun Hemachandra Date
................................... ...........................
Dr. Peshala Jayasekara Date
iii
Abstract
In this era of telecommunication, the world is constantly moving through successive
generations of mobile communication technology. Moving from 3G to 4G and 5G, each
generation brings major improvements in network characteristics while adding new
capabilities. Teleoperation is an important field in robotics: from surgical robots to
space robots, teleoperated systems have found applications in many areas. To achieve
the expected performance in telerobotics, a large amount of data must be transferred
continuously over a high-speed network. Telerobotics therefore generally requires a
network with high bandwidth, minimal packet loss, minimal delay, and the capacity
for real-time response. Since Ultra-Reliable Low Latency Communication (URLLC)
is defined as one of the key use cases of 5G technology, 5G is well suited to
applications such as telerobotics. This report discusses the development of a simple
telerobot that uses 5G technology to establish communication between the user and
the robot. We mainly focus on avoiding the typical problems of currently used wired
and wireless networks by using a 5G network. The end solution described in this
report establishes a telerobotic system with minimal latency while maintaining the
other QoS parameters, such as packet loss.
The remainder of this report is organized as follows. Chapter 1 gives an overall
introduction to the project and reviews related work on telerobotic applications
under different network conditions. Chapter 2 presents the system architecture of
the 5G robotic system we have deployed, supported by Software Defined Networking
(SDN) and Network Function Virtualization (NFV), together with the web application
design and the robot implementation, and describes a comparative study of 4G and
5G based on key performance indicators. Chapter 3 presents the results and a
discussion of the suggested architecture, and finally we draw our conclusions.
iv
Dedication
Firstly, we dedicate our Final Year Project, to the Department of Electronic and
Telecommunication Engineering, University of Moratuwa, the place where we gained
the knowledge and skills to make our project a success.
Then we dedicate our project to our dear lecturers who constantly supported,
guided, and encouraged us to apply the best of our abilities to the accomplishment
of our project.
Furthermore, we dedicate this project to our parents and teachers who have made
immense sacrifices to bring us to the position where we are today.
Finally, we would like to dedicate our project to the general community who wish
to apply their knowledge to solve real-world problems by coming up with innovative
solutions for the betterment of society.
v
Acknowledgements
First, we would like to express our sincere gratitude to our supervisors, Dr. Kasun
Hemachandra and Dr. Peshala Jayasekara, for their continuous guidance, support,
and commitment to the success of this project.
We are thankful to our Head of the Department, Dr. Ranga Rodrigo, the final year
project lecturer coordinator, Dr. Mevan Gunawardena, and all other lecturers in our
department who gave valuable comments and suggestions during presentations and
helped us improve the results of this project.
We extend our gratitude to our external collaborators, Eng. Shamil Dilshan
Premathunga and Eng. Pasan Dharmasiri from the Dialog Innovation Lab, University of
Moratuwa, for the immense support given throughout the project.
We would like to express our sincere gratitude to Ms. Srianthie Salgado for her
valuable advice on preparing our project reports in a professional style.
Finally, we are thankful to our dear parents for making numerous sacrifices and
contributions to make our project a success within the given time frame, and we
thank all those who directly and indirectly helped to make this project a success.
vi
Contents
Declaration ii
Abstract iv
Dedication v
Acknowledgements vi
Contents viii
1 Introduction 1
1.1 Nature and the scope of the problem . . . . . . . . . . . . . . . . . . 1
1.1.1 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 Proposed Solution . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.2.1 Why use 5G? . . . . . . . . . . . . . . . . . . . . . . 2
1.1.3 Project Architecture . . . . . . . . . . . . . . . . . . . . . . . 3
1.1.4 Primary Objectives . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Literature survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.1 Telerobot networks types . . . . . . . . . . . . . . . . . . . . . 5
1.2.2 Video Encoding Methods . . . . . . . . . . . . . . . . . . . . . 5
1.2.3 Real-time video transmission . . . . . . . . . . . . . . . . . . . 5
1.2.4 Delay and Reliability requirements . . . . . . . . . . . . . . . 6
1.3 Method of investigation . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3.1 5G Network deployment . . . . . . . . . . . . . . . . . . . . . 7
1.3.1.1 Core Network deployment . . . . . . . . . . . . . . . 7
1.3.1.2 Radio Access network deployment . . . . . . . . . . . 7
1.3.2 Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3.2.1 Fully Virtual Robot Simulation . . . . . . . . . . . . 7
1.3.2.2 A Life-sized Physical Robot . . . . . . . . . . . . . . 7
1.3.3 Tele-operation Web Application . . . . . . . . . . . . . . . . . 8
1.3.3.1 WebSockets . . . . . . . . . . . . . . . . . . . . . . . 8
1.4 Principal results of the investigation . . . . . . . . . . . . . . . . . . . 8
2 Methodology 9
2.1 5G network deployment . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1.1 5G core network deployment . . . . . . . . . . . . . . . . . . . 9
2.1.1.1 5G Standalone vs Non-standalone architecture . . . . 9
2.1.1.2 5G core network deployment options . . . . . . . . . 11
vii
2.1.1.3 5G core network functions . . . . . . . . . . . . . . 12
2.1.1.4 OAI core network deployment . . . . . . . . . . . . . 15
2.1.2 5G Radio Access Network Deployment . . . . . . . . . . . . . 17
2.1.2.1 Virtual Implementation using RAN Simulators . . . 17
2.1.2.2 Implementation with USRP Devices and 5G modules 23
2.2 Teleoperation Application . . . . . . . . . . . . . . . . . . . . . . . . 25
2.2.1 Operator Interface . . . . . . . . . . . . . . . . . . . . . . . . 27
2.3 Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.3.1 Objective of the Robot Design . . . . . . . . . . . . . . . . . . 29
2.3.1.1 Basic Maneuvering Capability . . . . . . . . . . . . . 30
2.3.1.2 Real-Time Audio/Video Communication . . . . . . . 30
2.3.1.3 Collision Avoidance . . . . . . . . . . . . . . . . . . . 30
2.3.1.4 Robot Design process . . . . . . . . . . . . . . . . . 30
2.3.2 Components Used . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.2.1 Raspberry Pi 4B . . . . . . . . . . . . . . . . . . . . 31
2.3.2.2 Pololu 37D Metal Gear Motors . . . . . . . . . . . . 32
2.3.2.3 Dangaya 2.0 Motor Driver . . . . . . . . . . . . . . . 33
2.3.2.4 HC-SR-04 Ultrasonic Sensor . . . . . . . . . . . . . . 33
2.3.2.5 Generic Web Camera . . . . . . . . . . . . . . . . . . 35
2.3.2.6 Voltage regulators . . . . . . . . . . . . . . . . . . . 36
2.3.2.7 SimComm 5G UE Module . . . . . . . . . . . . . . . 37
2.3.3 Building the Robot . . . . . . . . . . . . . . . . . . . . . . . . 38
2.3.3.1 SolidWorks Design . . . . . . . . . . . . . . . . . . . 38
2.3.3.2 PCB Design . . . . . . . . . . . . . . . . . . . . . . . 39
2.3.4 Suggestions for Further Improvements . . . . . . . . . . . . . 41
2.3.4.1 Build a Life-sized Robot . . . . . . . . . . . . . . . . 41
2.3.4.2 Use More Sophisticated Hardware . . . . . . . . . . . 41
2.3.4.3 Test the Robot on a 5G SA Network . . . . . . . . . 41
2.3.5 Alternative Methods . . . . . . . . . . . . . . . . . . . . . . . 41
3 Results 44
3.1 5G network Deployment . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.1.1 Ping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.1.1.1 Pinging from UE to Ext-dn . . . . . . . . . . . . . . 44
3.1.1.2 Pinging between UEs . . . . . . . . . . . . . . . . . . 46
3.1.2 Iperf test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.2 Teleoperation Application . . . . . . . . . . . . . . . . . . . . . . . . 49
References 52
Appendix 54
viii
List of Figures
ix
3.3 Ping values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
3.4 AMF logs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.5 Ping values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.6 Wireshark analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.7 Throughput values . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.8 Latency measuring method . . . . . . . . . . . . . . . . . . . . . . . . 49
3.9 Latency comparison between different networks . . . . . . . . . . . . 50
x
List of Tables
xi
Acronyms and Abbreviations
UE - User Equipment
URLLC - Ultra Reliable Low Latency Communication
USRP - Universal Software Radio Peripheral
SDR - Software Defined Radio
IP - Internet Protocol
TCP - Transmission Control Protocol
UDP - User Datagram Protocol
AMF - Access and Mobility Function
UPF - User Plane Function
SMF - Session Management Function
NRF - Network Repository Function
Ext-dn - External Data Network
SDP - Session Description Protocol
ICE - Interactive Connectivity Establishment
xii
1 Introduction
1.1 Nature and the scope of the problem
Telerobotics combines two aspects:
1. Teleoperation - sending control commands from the operator to the remote robot
so that it can be manipulated from a distance
2. Telepresence - getting video-audio feedback from the robot so that the operator
feels present in the remote environment, projecting his or her presence through
the remote robot
Telerobotics is used in several fields nowadays. Telerobotic devices are typically
developed for situations or environments that are too dangerous, uncomfortable,
limiting, repetitive, or costly for humans. Telesurgery, bomb disposal, and space
exploration are some examples.
1
1.1.2 Proposed Solution
To avoid the above-mentioned drawbacks, in this project we use 5G technology
to establish the connection between the robot and the control interface. 5G provides
higher data rates than existing mobile networks, which makes it a good solution
for high-quality video and audio transmission. Since 5G provides higher reliability
and lower latency, it is a particularly good solution for control signal transmission.
5G also provides higher connection density compared to other networks, making it
suitable for scenarios where multiple robot units are used. Network slicing is a
unique property of 5G technology that allows the establishment of different slices
for different use cases with different QoS requirements. Using 5G for telerobotics
can be recognized as an Ultra-Reliable Low Latency Communication (URLLC) use case.
• Network slicing
As per the architecture of our project, three types of data are transmitted
between the robot and the controller: video, audio, and control signals. 5G
technology has introduced a network management feature called "network slicing",
which places different types of network traffic into separate slices with
different QoS parameters so that they can be treated differently. This is
crucial for the teleoperation scenario, since low-latency traffic and
high-bandwidth traffic can be handled by two different slices. For telerobotics,
an ultra-low-latency slice can carry the teleoperation (control) traffic, while
the telepresence traffic (video and audio signals) is transmitted over a
high-capacity slice with higher bandwidth.
In our solution, the robot and the operator will be connected through a 5G network
that ensures low latency and the other QoS parameters. The robot will be teleoperated
from a distance, but it will carry sensors that enable semi-autonomous navigation.
The sensor data will be processed onboard, helping the teleoperator to identify and
avoid collisions. The operator will use a Graphical User Interface, developed by us,
to control the robot. The interface will display the live video feed from the camera
installed on the robot and will take input from the user to control the robot. A
speaker will be installed on the robot to communicate with the customer that the
robot will be serving. The component selection for the robot is discussed in the
latter part of the report.
In the end solution, there will be mainly two links between the robot and the
teleoperator, as indicated in the figure above. One link will be established to send
manipulation commands and audio from the teleoperator to the robot. A feedback link
from the robot to the operator will also be established to carry video and audio
feedback from the robot to the teleoperator. Video encoding methods will be used to
reduce the bandwidth requirement of the feedback link.
• Reduce latency considering all possible factors (Network latency and processing
latency)
• Telepresence - Getting video-audio feedback from the robot hence, the operator
feels present in the remote environment, projecting his or her presence through
the remote robot
4
1.2 Literature survey
5
network load situations.
6
1.3 Method of investigation
This project can be subdivided into three major sections as follows. For each
section we thoroughly analyzed the possible alternative solutions and selected the
one best suited to our requirements. This is discussed further in Chapter 2.
• When selecting the software platform for the 5G standalone network, OAI was
selected since it supports RAN simulators and has a large developer community.
• When selecting an SDR device, the USRP B210 was chosen because it is compatible
with the OAI core.
7
1.3.3 Tele-operation Web Application
1.3.3.1 WebSockets
• Use WebSockets for communication instead of peer-to-peer communication.
• In the 5G core network and RAN, we were first able to connect one UE through
one gNodeB and take network measurements such as ping and throughput (using an
iperf test).
• We were able to connect the web application and the robot through a LAN or
other mobile communication networks. We tested it over a LAN, a 4G network,
and a 5G NSA network.
8
2 Methodology
In this chapter, the methodology used for 5G network deployment, video and audio
transmission, robot implementation, and web application development will be
discussed.
2.1 5G network deployment
Mobile networks evolved from 1G to 5G, adding new features and capabilities in each
generation. The first generation of mobile networks was only capable of voice
communication. 2G networks introduced data services, and the 2G core network has two
separate entities to handle data traffic and voice traffic. 3G networks enhanced the
uplink and downlink capabilities of the previous 2G networks. 4G networks were a
game-changer because they do not have two separate cores for voice and data; they
have only a packet core, so voice is also transmitted as data packets. One
similarity across all of these generations is that they all use vendor-specific
devices to perform the different core network functions. 5G, in contrast, uses
Virtual Network Functions in its core network.
In our project, we used the 5G virtual core network functions from the
OpenAirInterface (OAI) Software Alliance, founded by EURECOM. These virtual core
network functions are open source and written in C++, which made configuration and
debugging more flexible.
We deployed the core network functions using Docker, with a separate container for
each core network function. The containers were then networked using Docker
networking techniques, and each container was assigned an IP address in the range
192.168.70.128/26.
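As a quick sanity check on that addressing scheme, the usable host range of the quoted /26 can be computed with Python's standard ipaddress module. This is an illustrative sketch, not the project's actual tooling; the allocate helper is our own:

```python
import ipaddress

# Subnet used for the core-network containers (quoted in the text above).
CORE_SUBNET = ipaddress.ip_network("192.168.70.128/26")

def allocate(n):
    """Return the first n assignable host addresses in the core subnet."""
    hosts = list(CORE_SUBNET.hosts())
    if n > len(hosts):
        raise ValueError(f"subnet only has {len(hosts)} usable addresses")
    return [str(h) for h in hosts[:n]]

# A /26 leaves 62 usable host addresses -- ample for the handful of
# network functions (AMF, SMF, NRF, UPF, Ext-dn, ...).
print(len(list(CORE_SUBNET.hosts())))   # 62
print(allocate(3))
```

With each function pinned to one address in this pool, the containers can reach one another by static IP without any service discovery.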
When it comes to the 5G core network deployment there are two main architectures
available[13].
9
• Non-Standalone Architecture (NSA)
NSA relies on 4G network facilities to provide higher speed and data bandwidth.
A 5G-enabled device will connect to a 5G or 4G network depending on conditions.
With NSA 5G, the clue is in the name: it is 5G that cannot stand on its own in
terms of infrastructure. NSA is a 5G radio access network (RAN) that operates on
a legacy 4G LTE core, known as the Evolved Packet Core (EPC), which manages the
control plane functions. NSA includes both a 4G base station (eNodeB) and a 5G
base station (gNodeB), but the 4G base station takes precedence. Because the NR
control plane anchors to the EPC, radio frequency signals forward to the primary
4G base station.
The drawback of NSA 5G, however, is that it cannot deliver certain capabilities
that a pure SA 5G network can. For example, NSA does not enable the low latency
that is one of the biggest draws of 5G; for this reason, NSA is not suitable for
URLLC applications such as telerobotics. Another disadvantage of NSA is that it
requires more energy to power a 5G network on 4G infrastructure. 5G NR is more
energy-efficient than LTE, IEEE reported, but using two different forms of
cellular technology significantly increases power consumption in the network.
• Standalone Architecture (SA)
SA is the true 5G network, in which the network has its own dedicated 5G
facilities to provide large speed improvements and minimal network latency
(delay). The 5G SA network is independent of the 4G network. SA 5G networks
include both a 5G RAN and a cloud-native 5G core, something NSA networks lack
and substitute with a 4G core. SA networks can perform essential 5G functions,
such as reducing latency, improving network performance, and centrally
controlling network management functions, because of their independent 5G cores.
For these reasons, a standalone 5G core network is the best candidate for
applications such as telerobotics.
10
Figure 2.1: Non-standalone Vs. Standalone architecture
11
Due to the pandemic, the labs were not open, so the third option, using a fully
virtual network with a software platform, was used to deploy the 5G core network.
Several platforms were available, such as OpenAirInterface, NextEPC, and free5GC.
Among them, OpenAirInterface was chosen because it is an open-source platform with
a large community. Also, since the OAI 5G project mainly focuses on deploying a
standalone 5G core network, it is more suitable for our application for the reasons
mentioned previously.
If we deploy the network using physical devices, the usual approach is to implement
the user plane function and the control plane functions in two virtual machines. In
OpenAirInterface, we can instead deploy these network functions in Docker
containers, removing the need for virtual machines. We deployed the network using
this second method. The advantage of the Docker container deployment is that it
supports radio access network simulation. In our project, we used the following
simulators.
• RF simulator
• gnBSim
Network functions connected via a Service Bus in the Control Plane form the Service
Based Architecture (SBA) of the 5G Core Network.
12
In the SBA, Network Function (NF) capabilities are exposed via REST APIs based on
the HTTP/2 protocol. Interconnection between NFs can follow either the
Request/Response model or the Subscribe/Notify model to avail of the different 5G
services[14].
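To illustrate the Subscribe/Notify idea in miniature: one NF registers interest in an event, and another NF publishes notifications to all interested parties. This is pure Python with no HTTP/2 or real NFs involved; the EventBus class and the event names are our own invention:

```python
# Toy illustration of SBA-style Subscribe/Notify between network functions.
# No HTTP/2 or real NFs here; every name is our own simplification.
class EventBus:
    def __init__(self):
        self.subscribers = {}          # event name -> list of callbacks

    def subscribe(self, event, callback):
        self.subscribers.setdefault(event, []).append(callback)

    def notify(self, event, payload):
        # Deliver the payload to every NF subscribed to this event.
        for cb in self.subscribers.get(event, []):
            cb(payload)

received = []
bus = EventBus()
# A consumer NF (think: SMF) subscribes to UE registration events
# published by a producer NF (think: AMF).
bus.subscribe("ue-registered", lambda payload: received.append(payload))
bus.notify("ue-registered", {"imsi": "208950000000031"})
print(received)
```

In the real SBA the subscription is an HTTP request to the producer NF and the notification is an HTTP callback, but the control flow is the same.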
The following are the main functions used in the 5G standalone core network[15].
13
• UPF (User Plane Function)
It performs operations like packet routing and forwarding, packet inspection,
policy enforcement for the user plane, QoS handling, etc. When compared with the
4G EPC, its functionalities resemble the SGW-U (Serving Gateway User Plane
function) and PGW-U (PDN Gateway User Plane function) of the 4G network.
14
• AF (Application Function)
It performs operations like accessing the Network Exposure Function for retrieving
resources, interacting with the PCF for policy control, exposing services to end
users, etc. When compared with the 4G EPC, its functionalities resemble the AF of
the 4G network.
As mentioned in the previous section, OAI was used to deploy the virtual core
network. Each of the above-mentioned core network functions is available as a
Docker image, and each function has its own repository. The functions are deployed
in Docker containers, and each has a unique IP address that allows communication
between the functions. Some repositories related to network slicing, such as the
NSSF, NWDAF (Network Data Analytics Function), UDSF (Unstructured Data Storage
Function), and the VPP (Vector Packet Processing) based UPF, are private since
they are still in development. Therefore some functionalities, such as network
slicing, are not achievable until they are published.
Figure 2.3 shows the 5G core network functions we have deployed using the OAI
architecture [1]. The following Docker containers were deployed in order to
implement the core network:
• OAI-AMF - 192.168.71.132
• OAI-SMF - 192.168.71.133
• OAI-NRF - 192.168.71.130
• OAI-EXTdn - 192.168.71.135
15
Figure 2.3: OpenAirInterface 5G core network architecture
16
2.1.2 5G Radio Access Network Deployment
The Radio Access Network is essential in any mobile network; it allows user
equipment to connect to the network through a radio interface. In our project, we
have two pieces of user equipment: one is the laptop or 5G-enabled mobile device
that runs the teleoperation application, and the other is the robot itself. There
were two options available for implementing the radio access network:
17
Figure 2.4: Connecting 2 virtual UEs to the core network through virtual gNodeBs
As in the figure above, we host the core network functions on one computer and
connect two other computers using a LAN connection. On each of these two computers
we host a virtual gNodeB paired with a virtual UE. The following configurations are
needed for proper connectivity.
When configuring the gNodeB, we have to match the following parameters in the
Access and Mobility Function (AMF) with the gNodeB configuration file, as shown in
figure 2.5.
• gNodeB ID
18
Figure 2.5: gNodeB configuration file
In addition, we have to put the IP address of the host machine on which we deployed
the virtual gNodeB in the following section of the gNodeB configuration file, as
shown in figure 2.6.
19
Figure 2.6: Network interfaces in the gNodeB
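A sketch of the kind of consistency check this parameter matching implies is shown below. The parameter names are our placeholders, not OAI's actual configuration keys, and the values are illustrative:

```python
# Hypothetical sketch: verify that identity parameters in the gNodeB config
# match what the AMF expects. Key names are placeholders, not OAI's own.
AMF_EXPECTED = {"mcc": "208", "mnc": "95", "tac": 1, "gnb_id": 0xE00}
GNB_CONFIG = {"mcc": "208", "mnc": "95", "tac": 1, "gnb_id": 0xE00,
              "amf_ip": "192.168.71.132"}

def mismatches(amf, gnb):
    """Return the parameter names whose values differ between the configs."""
    return [k for k in amf if gnb.get(k) != amf[k]]

print(mismatches(AMF_EXPECTED, GNB_CONFIG))   # [] means consistent
```

If any parameter differs, the gNodeB's NG setup toward the AMF will fail, which is why the two files have to be edited together.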
After successfully configuring the gNodeB configuration file, we can execute the
following command to start the gNodeB.
20
Connecting and Starting the UE
• key
• opc
After successfully configuring the UE, we can use the following command to start
the virtual UE in RFSimulator mode.
21
The virtual UE automatically attaches to the gNodeB when RFSimulator mode is on.
The Access and Mobility Function (AMF) of the core network identifies the UE as
soon as it attaches to the gNodeB. Therefore, by analyzing the AMF log files, we
can confirm that the UE is connected to the core network through the gNodeB.
We deployed two pairs of virtual gNodeB and virtual UE, with both gNodeBs
connected to the core network as shown in the architecture diagram. We used this
setup for testing and for taking network measurements. A picture of the real setup
is shown below in figure 2.11.
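As a rough illustration of scanning such logs, the snippet below picks registration lines out of AMF-style output. The sample log text and its exact format are fabricated for the example; OAI's real AMF output differs:

```python
import re

# Fabricated sample of AMF-style log output -- NOT OAI's exact format.
amf_log = """
[AMF] [info] gNB connected: id 3584
[AMF] [info] UE 208950000000031 state 5GMM-REGISTERED
"""

def registered_ues(log_text):
    """Return identifiers of UEs the log reports as registered."""
    return re.findall(r"UE (\d+) state 5GMM-REGISTERED", log_text)

print(registered_ues(amf_log))
```

A one-off filter like this makes it easy to confirm, from captured logs, that each simulated UE completed registration through its gNodeB.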
22
2.1.2.2 Implementation with USRP Devices and 5G modules
As mentioned earlier, there are two ends: the robot and the teleoperation
application. Only one USRP B210 device and one SimCom 5G module were available at
the Dialog Research Lab; these are expensive devices, and approval and special
clearances are needed to import them into Sri Lanka since they are 5G devices.
Therefore, our initial plan was to deploy two gNodeBs, one using the USRP B210 and
the other using a simulator. We would then connect the robot to the USRP gNodeB
via the SimCom 5G module and run the application on a laptop connected to the
simulated gNodeB. The following diagram shows this architecture.
23
Figure 2.13: Setup with USRP module
24
2.2 Teleoperation Application
We designed a web interface to view the video feed from the robot and to control
the robot, built using the WebRTC framework. WebRTC stands for Web Real-Time
Communication. WebRTC connections are peer-to-peer, so no server is involved in
transferring real-time data from one client to another, and sending data through
WebRTC is simple via its APIs, which are available for almost all platforms.
However, saying that no servers are required is not entirely accurate: servers are
required initially for the client computers to get connected, but once the
connection is established, the clients can communicate with each other without a
server.
The operator end first sends data about itself (its media configuration) to the
signaling server, which forwards it to the robot end, where it is stored. The robot
end then sends data about itself to the server, which forwards it to the operator
side. At this point both clients know each other's media configurations, but they
still do not know how to connect; for that, they must exchange their network
information. The network configuration data that must be transferred for the
connection to happen is called ICE candidates. ICE candidates are generated with
the help of STUN and TURN servers. To obtain ICE candidates, a client provides the
URLs of the STUN and TURN servers to the WebRTC API; as soon as the client creates
an offer, the WebRTC API starts gathering ICE candidates from the STUN and TURN
servers. The client's job is then to send each ICE candidate to the other client,
so it sends the candidate to the signaling server, which forwards it to the peer.
Once both clients have each other's network information, the connection is
established between them, and they can exchange data without a server.
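The handshake described above can be sketched as a toy in-memory signaling relay. A real deployment carries SDP offers/answers and ICE candidates over a transport such as WebSockets; every class and field name here is our own simplification:

```python
# Toy in-memory model of the signaling handshake described above.
# Real WebRTC exchanges SDP and ICE candidates over a network transport;
# every name below is our own simplification.
class SignalingServer:
    def __init__(self):
        self.mailboxes = {"operator": [], "robot": []}

    def relay(self, sender, message):
        # Forward the message to the other peer's mailbox.
        receiver = "robot" if sender == "operator" else "operator"
        self.mailboxes[receiver].append(message)

server = SignalingServer()
# 1. Operator sends its media description (an "offer").
server.relay("operator", {"type": "offer", "media": ["video", "audio"]})
# 2. Robot answers with its own media description.
server.relay("robot", {"type": "answer", "media": ["video", "audio"]})
# 3. Both sides exchange network info (ICE candidates) the same way.
server.relay("operator", {"type": "ice", "candidate": "host 10.0.0.5:5000"})
server.relay("robot", {"type": "ice", "candidate": "host 10.0.0.9:5002"})

# After this exchange each peer knows the other's media and network info,
# and the media path no longer needs the server.
print([m["type"] for m in server.mailboxes["robot"]])
print([m["type"] for m in server.mailboxes["operator"]])
```

Once the offer/answer and ICE exchange completes, the relay drops out of the path and media flows peer-to-peer.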
We also deployed our own TURN server for the application because the existing TURN
servers did not work properly; they introduced a significant delay in establishing
the connection. We used the DigitalOcean cloud platform to host our TURN server.
26
Figure 2.16: TURN Server
As mentioned in Figure 2.14, at the robot end two threads were used, one for
controlling the actuators and the other for autonomous braking. This will be
explained in detail in the next chapter.
27
control the robot.
The window at the top left shows a video stream of the person controlling the
robot. This window can be minimized, and the user can mute the audio or stop the
video according to his or her preference. The larger window on the right
integrates the video stream from the robot. The web application also provides
commands: by pressing the corresponding key on the keyboard, we can control the
movement of the robot.
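The keyboard control can be sketched as a key-to-command mapping. The key bindings and JSON fields below are illustrative guesses, not the application's actual protocol:

```python
import json

# Hypothetical key bindings -- the real application's bindings and JSON
# schema are not given in this report, so these are illustrative only.
KEYMAP = {
    "w": {"cmd": "move", "dir": "forward"},
    "s": {"cmd": "move", "dir": "backward"},
    "a": {"cmd": "turn", "dir": "left"},
    "d": {"cmd": "turn", "dir": "right"},
    " ": {"cmd": "stop"},
}

def key_to_json(key):
    """Translate a keypress into the control-command JSON sent to the robot."""
    command = KEYMAP.get(key, {"cmd": "noop"})
    return json.dumps(command)

print(key_to_json("w"))
```

Each keypress in the browser thus becomes one small JSON message carried over the low-latency control link.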
28
2.3 Robot Design
2. Tele operability
29
3. Real-time Audio/Video communication
30
Table 2.2: Component Selection
Type | Component | Reason to Choose
Robot controller + Communication Module | Raspberry Pi 4B | Performance and availability
Motors | Pololu 37D Metal Gear Motors with Encoders | Performance and availability
Motor Driver | Dangaya 2.0 Motor Driver | Compatibility and availability
Distance Measurement Sensor | HC-SR04 Ultrasonic Sensor | Low cost
Camera and Mic | Generic Web Cam | Low cost and availability
Power Source | 11.1V 2200mAh Li-Po Battery | Availability
Voltage Regulators | Buck-boost and buck converters | Availability
Backup Communication Module | SimCom 5G UE Module | Availability
Backup Power Source | Mi 10000mAh power bank | Availability
• 2.4 GHz and 5.0 GHz IEEE 802.11ac wireless, Bluetooth 5.0, BLE
• Gigabit Ethernet
• Micro-SD card slot for loading operating system and data storage
31
Figure 2.19: Raspberry Pi 4B Single Board Computer
32
2.3.2.3 Dangaya 2.0 Motor Driver
Dangaya 2.0, officially the Aptinex Dangaya 2.0, is a VNH5019-based dual motor
driver. The operating voltage of this module is 5.5V to 24V, and it provides a
continuous current supply of 12A per channel. This motor driver is compatible with
the Pololu 37D motors, which demand a 12V input.
33
Figure 2.22: HC-SR04 Ultrasonic Sensor
The advantage of using this module is that it is simple to set up and use.
Furthermore, it does not need additional hardware or processing overhead to
operate. Its main disadvantage is that its optimal distance range is limited,
unlike ToF sensors and LiDAR sensors.
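The HC-SR04 reports distance through the round-trip time of an ultrasonic pulse, so the conversion is a one-liner. The sketch below uses the usual ~343 m/s speed of sound in air and our own function name:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_to_distance_cm(echo_time_s):
    """Convert an HC-SR04 echo pulse width (seconds) to distance in cm.

    The pulse travels to the obstacle and back, so the one-way
    distance is half the total path length.
    """
    return (echo_time_s * SPEED_OF_SOUND / 2.0) * 100.0

# An echo of about 1.166 ms corresponds to roughly 20 cm -- the
# autonomous-braking threshold used by our robot.
print(round(echo_to_distance_cm(0.001166), 1))
```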
34
Figure 2.24: Effective Distance Measurement Range for Ultrasonic Sensors
35
2.3.2.6 Voltage regulators
In this robot design, two types of voltage regulator modules were used: a 120W
buck-boost converter and two XL4015 buck converters. The buck converters were used
to step the voltage down to operable levels; one provides a 5V output and the
other a 12V output. The motors used in the robot demand a 12V input voltage, even
though the output voltage of the battery is only 11.1V. To address this, the 120W
buck-boost converter boosts the battery's output voltage up to 15V, which is then
stepped back down to 12V, the suitable voltage for the motors.
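For an ideal buck converter the duty cycle is simply D = Vout/Vin, so the 15V-to-12V step-down stage runs at about 80% duty. The sketch below is idealized and ignores converter losses; the function name is ours:

```python
def buck_duty_cycle(v_in, v_out):
    """Ideal buck-converter duty cycle D = Vout / Vin (losses ignored)."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step voltage down")
    return v_out / v_in

# The boost stage lifts the 11.1V battery to 15V, then the buck stage
# steps that down to the 12V the motors need.
print(buck_duty_cycle(15.0, 12.0))   # 0.8
```

Boosting first to 15V gives the buck stage headroom, so the 12V rail stays regulated even as the battery sags below 11.1V under load.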
36
2.3.2.7 SimComm 5G UE Module
This module was used as a backup communication module for the robot; using it, the
robot can be connected directly to a base station. The SimCom 5G module is
configured through AT commands, which can be used to connect the device to a 5G
base station when one is available. The following figure shows a serial monitor
communicating with the SimCom module using AT commands. The first command,
AT+COPS=?, shows the available networks that the module can connect to; the last
integer parameter indicates the type of each network.
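As an illustration, a +COPS test response can be parsed with a few lines of Python. The response shape and access-technology codes follow 3GPP TS 27.007 as we understand it (7 = LTE, 11 = 5G NR), but the operator names in the sample are made up:

```python
import re

# Selected AT+COPS access-technology codes (per 3GPP TS 27.007,
# to the best of our understanding).
ACT_NAMES = {0: "GSM", 2: "UTRAN", 7: "E-UTRAN (LTE)", 11: "NR (5G)"}

def parse_cops(response):
    """Parse an AT+COPS=? test response into (operator, technology) pairs."""
    pairs = []
    for stat, long_name, short_name, numeric, act in re.findall(
            r'\((\d+),"([^"]*)","([^"]*)","(\d+)",(\d+)\)', response):
        pairs.append((long_name, ACT_NAMES.get(int(act), f"AcT {act}")))
    return pairs

# Fabricated sample response -- operator names are not real.
sample = '+COPS: (2,"OperatorA","OpA","41301",7),(1,"OperatorA","OpA","41301",11)'
print(parse_cops(sample))
```

The last integer in each tuple is what the report refers to as the network type: it tells you whether a listed network is 4G or 5G before issuing the connect command.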
37
Figure 2.29: Communicating with SimComm 5G UE Module with AT Commands
through a Serial Monitor
38
Figure 2.30: CAD Drawing of the Robot
The final robot design for the demonstration of the application is as follows. The robot operates with a multi-threaded setup: one thread for control commands and the other for autonomous braking. When a control-command JSON is received by the web app, it is forwarded to the control-command thread through a Flask app. The other thread continuously measures the distance to any obstacle ahead and applies the brakes if an obstacle is detected within 20 cm.
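The two-thread setup described above can be sketched as follows; the class, method names, and the stubbed distance sensor are hypothetical illustrations, not the actual robot code (which drives real motors from an ultrasonic sensor reading):

```python
import queue
import threading
import time

BRAKE_DISTANCE_CM = 20  # apply the brakes when an obstacle is closer than this

class MiniRobot:
    """Two-thread control sketch: one thread consumes control-command
    JSONs, the other monitors obstacle distance and brakes autonomously."""

    def __init__(self, read_distance_cm):
        self.read_distance_cm = read_distance_cm  # sensor read (stubbed here)
        self.commands = queue.Queue()  # filled by the Flask route on each request
        self.last_cmd = None
        self.braked = False
        self._stop = threading.Event()

    def submit(self, cmd: dict):
        """Called when a control-command JSON arrives from the web app."""
        self.commands.put(cmd)

    def _control_loop(self):
        while not self._stop.is_set():
            try:
                cmd = self.commands.get(timeout=0.1)
            except queue.Empty:
                continue
            if not self.braked:      # ignore drive commands while braked
                self.last_cmd = cmd  # the real robot would drive the motors here

    def _brake_loop(self):
        while not self._stop.is_set():
            self.braked = self.read_distance_cm() < BRAKE_DISTANCE_CM
            time.sleep(0.05)

    def run(self, seconds=0.3):
        threads = [threading.Thread(target=self._control_loop),
                   threading.Thread(target=self._brake_loop)]
        for t in threads:
            t.start()
        time.sleep(seconds)
        self._stop.set()
        for t in threads:
            t.join()

robot = MiniRobot(read_distance_cm=lambda: 15)  # stub: obstacle at 15 cm
robot.submit({"direction": "forward"})
robot.run()
print(robot.braked)  # → True: the obstacle is within 20 cm
```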
Figure 2.31: Final Design of the Robot
2.3.4 Suggestions for Further Improvements
2.3.4.1 Build a Life-sized Robot
By building a life-sized robot, the system can be tested in an actual restaurant environment rather than a small arena. This will help identify improvements needed in a real operating environment.
• Webots
• Gazebo
After considering the pros and cons of Webots and Gazebo, the open-source simulator Webots was chosen to simulate the robot and the restaurant environment. A simple yet complete restaurant environment was designed from scratch in Webots, with a few dining tables, chairs, a dining area, and a pantry area. To demonstrate the functionality of the robot, a simple robot design was used. The robot consists of 2 driving wheels, 1 passive wheel, a camera, and a platform to carry the food. The robot can be controlled using a keyboard, and at the moment the camera feed is displayed in a new window. Later, this is to be integrated into the web app, where the video feed and robot controls are to be displayed, with the video streamed using WebRTC in the later stages.
Figure 2.34: Robot Design
The main drawback of this method was that the performance of the robot simulation depended heavily on the hardware platform used to run it; the laptop on which the simulation ran was the bottleneck.
3 Results
In this chapter, we discuss intermediate results, analysis of requirements, and limitations of the implementation.
3.1 5G Network Deployment
We proceeded with the fully virtual setup discussed in a previous section to take network measurements and perform other tests. We used well-known network measurement tools such as “iperf”, “iperf3”, and “ping” to take the measurements, and Wireshark to analyse the packets. We used the log file of the AMF (Access and Mobility Management Function) for debugging purposes; it provides a high-level understanding and a live status report of the network.
3.1.1 Ping
3.1.1.1 Pinging from UE to Ext-DN
We used the ping tool simultaneously on both UEs, pinging the EXT-DN block (192.168.72.135), which represents the external data network. An average ping value of 38 ms was reported on the first UE, and the second UE reported an average of 44 ms. Screenshots of the UEs and a graph of the ping values are shown below.
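Such averages come from the per-packet round-trip times in the raw ping output; a minimal helper like the one below (with illustrative sample lines, not our actual capture) shows the calculation:

```python
import re

def average_rtt_ms(ping_output: str) -> float:
    """Average the time=XX ms round-trip values printed by ping."""
    times = [float(t) for t in re.findall(r"time=([\d.]+) ?ms", ping_output)]
    return sum(times) / len(times)

# Illustrative sample lines, not our actual capture.
sample = (
    "64 bytes from 192.168.72.135: icmp_seq=1 ttl=64 time=36.2 ms\n"
    "64 bytes from 192.168.72.135: icmp_seq=2 ttl=64 time=39.8 ms\n"
)
print(f"{average_rtt_ms(sample):.1f} ms")  # → 38.0 ms
```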
Figure 3.2: Ping from UE2 to Ext-dn
3.1.1.2 Pinging between UEs
The following figure shows a screenshot of the AMF logs taken when two UEs are connected to the core network through two gNodeBs. It confirms that the proposed architecture has been deployed.
We have configured the Session Management Function (SMF) of our core network to assign IPs for the UEs in the range 12.1.1.2 to 12.1.1.127. Therefore, when we connected two virtual UEs to the core network, one UE was assigned the IP address 12.1.1.2 and the other 12.1.1.3. To measure the connectivity and round-trip time between the UEs, we pinged from 12.1.1.3 to 12.1.1.2; the following figure shows the execution. We recorded an average round-trip time of 738 ms.
We used Wireshark to capture packets in the core network while pinging from one UE to another. The following Wireshark screenshot shows that packets from 12.1.1.3 to 12.1.1.2 traverse the 5G core network, indicating that both ping requests and replies go through the core. In addition, communication between different core network functions can also be seen.
Figure 3.7: Throughput values
These values depend heavily on hardware performance. The OpenAirInterface (OAI) 5G core network also has its own limitations: the official website of the software platform indicates that the current version of OAI does not fully support the URLLC use case. However, these values can be further improved with newer versions of OAI and more capable hardware.
3.2 Teleoperation Application
The web application and the robot can be connected through a LAN or a mobile communication network. We tested the setup over a LAN, a 4G network, and a 5G NSA network, and measured the latency of transferring control commands from the operator end to the robot end while controlling the robot over each network. We used our own method to measure these delay values; the following diagram shows how.
A simple method to measure the latency in the data channel was implemented. The data channel is used to transmit the control commands from the operator end to the robot end. First, the time is logged (say t1) at the operator end when the control command is issued. When the control command is received at the robot end, it is transmitted back to the operator end. When the control command is received back at the operator end, the time is logged again (say t2). Then the transmission delay (say td) is calculated as

td = (t2 − t1) / 2
This calculation was carried out based on two assumptions:
1. The forward and return paths experience equal delay, so the one-way delay is half the round-trip time.
2. The time taken to send back the control signal at the robot end is negligible compared to the transmission delay.
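The procedure can be mimicked end-to-end with a plain UDP echo over loopback; this is a simplified stand-in for the actual data channel, and the JSON payload and port handling here are illustrative:

```python
import socket
import threading
import time

def echo_once(sock: socket.socket):
    """Robot end: echo the received control command straight back."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

# The "robot" listens on an ephemeral loopback UDP port; loopback stands
# in for the real network path here.
robot = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
robot.bind(("127.0.0.1", 0))
threading.Thread(target=echo_once, args=(robot,), daemon=True).start()

# Operator end: log t1, send the command, wait for the echo, log t2.
operator = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
t1 = time.monotonic()
operator.sendto(b'{"direction": "forward"}', robot.getsockname())
operator.recvfrom(1024)
t2 = time.monotonic()

td = (t2 - t1) / 2  # one-way transmission delay in seconds
print(f"td = {td * 1000:.3f} ms")
operator.close()
robot.close()
```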
Using this method, we measured the delay values for different networks. As mentioned earlier, these are the delays for transferring control commands from the operator to the robot end; the latency of the video feed received from the robot differs slightly from these values. The following graph shows a latency comparison between the networks.
The graph indicates that the highest latency was observed on the 4G network, with an average delay of around 42 ms. The lowest latency was obtained on the local area network, with an average of 6 ms. On the NSA 5G network, we observed an average latency of around 24 ms. Due to the hardware performance of our computers and the signal quality of the 5G network, this 5G delay is higher than expected. We hope that, if these challenges can be overcome, a latency lower than 20 ms can be achieved.
4 Discussion and Conclusion
The Tele Robotics Through 5G project is meant to explore the possibilities of using 5G technology to operate a robot at a remote location. The main goal of the project is to demonstrate the Ultra-Reliable Low Latency Communication (URLLC) use case of 5G mobile networks. To this end, our group developed a miniature robot that can be controlled through any mobile network (from 2G to 5G), a web application to remotely operate the robot, and a virtual 5G standalone core network.
Our initial strategy was to connect the robot to the virtual 5G standalone core network using the USRP devices. However, this attempt failed due to compatibility issues, as discussed in the section above. Therefore, we connected our robot and the web application through the non-standalone 5G network provided by Dialog, which is available on the university premises. We were able to manipulate the robot remotely over this network with significantly lower delay values. High bandwidth was also available, providing smooth video feedback from the robot to the operator. However, the values obtained from the 5G non-standalone network are not good enough to demonstrate the URLLC use case of 5G, because URLLC is about achieving latency values as low as 1 ms.
Although we could not connect the robot to the standalone 5G core network developed by ourselves, we were able to take network measurements. We measured the round-trip time and bandwidth when two user equipments communicate with each other through our 5G standalone core network, and we successfully established the connection between the two UEs after days of learning about 5G core network functions. We gained a significant understanding of how the 5G core network functions should be configured and what each of them does.
Finally, we would like to conclude that the project is a success. It covered three quite different areas of technology: building a robot, building a web application, and building an end-to-end virtual 5G standalone network, all within a period of one year. The project can be improved in the coming years as well; from our point of view, future work should focus on connecting the robot and the application through the 5G standalone core network using USRP devices.
Appendix
Figure 4.2: PCB layout