
Converged optical, wireless, and data center network infrastructures for 5G services

Abstract:
This paper focuses on converged access/metro infrastructures for 5G services,
proposing a common transport network integrating wireless and optical network
segments with compute/storage domains. To identify the optimal mix of transport
network technologies (optical/wireless) and processing modules that are required
to support 5G services in a cost- and energy-efficient manner, a two-stage
optimization framework is proposed. In the first stage, a multi-objective
optimization scheme focusing on the transport network segment seeks to
minimize the capital expenditure of the converged 5G network by identifying
the optimal mix of wireless and optical transport network technologies. The
second stage focuses on the compute network segment and aims at identifying
suitable processing modules to which operational 5G services need to be
allocated. The performance of the proposed approach is examined using
realistic traffic statistics for various network technology choices, including
mmWave and passive optical networks (PONs) for transport, and fixed- and
elastic-grid optical networks, across a city-wide topology in Bristol, UK.
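
To make the two-stage idea concrete, the following is a minimal Python sketch, not the paper's actual formulation: stage one picks a transport technology (mmWave vs. PON) per access site under a weighted capex/energy objective, and stage two assigns 5G service loads to compute modules by first fit. All names, costs, and demand figures are illustrative assumptions.

```python
# Hypothetical sketch of the two-stage optimization described above.
# Technology costs and service demands are illustrative placeholders,
# not values from the paper.

# Stage 1: choose a transport technology per access site, trading off
# capital expenditure (capex) against energy consumption.
TRANSPORT = {            # technology -> (capex per site, energy per Gb/s); assumed
    "mmWave": (1.0, 0.8),
    "PON":    (1.5, 0.3),
}

def stage1(sites, w_capex=0.7, w_energy=0.3):
    """Pick, per site, the technology minimizing a weighted-sum objective."""
    plan = {}
    for site, demand in sites.items():
        plan[site] = min(
            TRANSPORT,
            key=lambda t: w_capex * TRANSPORT[t][0]
                        + w_energy * TRANSPORT[t][1] * demand,
        )
    return plan

# Stage 2: allocate 5G service loads to compute modules (first fit by
# remaining capacity), mirroring the compute-segment stage of the framework.
def stage2(services, modules):
    """services: {name: load}; modules: {name: capacity}."""
    free = dict(modules)
    placement = {}
    for svc, load in sorted(services.items(), key=lambda kv: -kv[1]):
        host = next((m for m, cap in free.items() if cap >= load), None)
        if host is None:
            raise RuntimeError(f"no module can host {svc}")
        free[host] -= load
        placement[svc] = host
    return placement

if __name__ == "__main__":
    sites = {"cell_A": 10.0, "cell_B": 2.0}   # Gb/s demand per site (assumed)
    print(stage1(sites))
    print(stage2({"eMBB": 4, "URLLC": 2}, {"edge_dc": 8, "metro_dc": 16}))
```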

A Data-Driven Multiobjective Optimization Framework for Hyperdense 5G Network Planning

Abstract:
The trials and rollout of fifth generation (5G) network technologies are
gradually intensifying as 5G is positioned as a platform that not only
accommodates exploding data traffic but also unlocks a multitude of use cases,
services, and deployment scenarios. However, the need for hyperdense 5G
deployments is revealing some of the limitations of planning approaches that
hitherto proved adequate for pre-5G systems. The hyperdensification envisioned
in 5G networks not only adds complexity to network planning and optimization
problems but also underlines the need for more realistic data-driven approaches
that consider cost, varying demands, and other contextual attributes to produce
feasible topologies. Furthermore, the quest for network programmability and
automation, including in the 5G radio access network (RAN) as manifested by
network slicing technologies and more flexible RAN architectures, is among the
other factors that influence planning and optimization frameworks. Collectively,
these deployment trends, technological developments, and evolving (and diverse)
service demands point towards the need for more holistic frameworks. This
article proposes a data-driven multiobjective optimization framework for
hyperdense 5G network planning, with practical case studies used to illustrate
its added value compared to contemporary network planning and optimization
approaches. Comparative results from a case study with real network data reveal
potential performance and cost improvements in the hyperdense optimized
networks produced by the proposed framework, owing to increased use of
contextual data about the planning area and a focus on objectives that target
demand satisfaction.
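
As a rough illustration of the multiobjective planning idea, the sketch below enumerates candidate small-cell subsets and keeps the Pareto front over deployment cost and unmet demand. It is a toy stand-in under assumed candidate sites, costs, and demand figures, not the article's framework or data.

```python
# Hypothetical sketch: Pareto-front enumeration over (cost, unmet demand).
# All candidate sites, costs, and demands are illustrative assumptions.
from itertools import combinations

CANDIDATES = {           # site -> (deployment cost, demand it can absorb)
    "s1": (3.0, 40.0),
    "s2": (2.0, 25.0),
    "s3": (4.0, 55.0),
}
TOTAL_DEMAND = 80.0      # aggregate traffic demand in the planning area

def evaluate(subset):
    """Return (cost, unmet demand) for a candidate site subset."""
    cost = sum(CANDIDATES[s][0] for s in subset)
    served = min(TOTAL_DEMAND, sum(CANDIDATES[s][1] for s in subset))
    return cost, TOTAL_DEMAND - served

def pareto_front():
    points = [
        (subset, evaluate(subset))
        for r in range(1, len(CANDIDATES) + 1)
        for subset in combinations(CANDIDATES, r)
    ]
    # Keep solutions that no other solution beats in both objectives.
    return [
        p for p in points
        if not any(q[1][0] <= p[1][0] and q[1][1] <= p[1][1] and q[1] != p[1]
                   for q in points)
    ]

if __name__ == "__main__":
    for subset, (cost, unmet) in pareto_front():
        print(subset, f"cost={cost:.1f}", f"unmet={unmet:.1f}")
```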

Intelligent network data analytics function in 5G cellular networks using machine learning

Abstract:
5G cellular networks come with many new features compared to legacy cellular
networks, such as the network data analytics function (NWDAF), which enables
network operators either to implement their own machine learning (ML) based
data analytics methodologies or to integrate third-party solutions into their
networks. In this paper, the structure and protocols of NWDAF as defined in the
3rd Generation Partnership Project (3GPP) standard documents are first
described. Then, a cell-based synthetic data set for 5G networks, based on the
fields defined by the 3GPP specifications, is generated. Anomalies are added to
this data set (e.g., suddenly increasing traffic in a particular cell), and
these anomalies are classified within each cell, subscriber category, and user
equipment. Afterward, three ML models, namely linear regression, long
short-term memory, and recursive neural networks, are implemented to study the
behaviour-information estimation (e.g., anomalies in the network traffic) and
network-load prediction capabilities of NWDAF. For network load prediction, the
three models are compared on mean absolute error, calculated as the difference
between the model's predictions and the actual generated data. For the
classification of anomalies, two ML models, logistic regression and extreme
gradient boosting, are compared on the area under the receiver operating
characteristic curve. According to the simulation results, the neural network
algorithms outperform linear regression in network load prediction, whereas the
tree-based gradient boosting algorithm outperforms logistic regression in
anomaly detection. These estimations are expected to improve the performance of
the 5G network through NWDAF.
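
For flavor, here is a hedged Python sketch of the two NWDAF tasks on synthetic data (not the paper's 3GPP-derived data set): network-load prediction scored by mean absolute error, and anomaly classification scored by ROC AUC. scikit-learn's GradientBoostingClassifier stands in for extreme gradient boosting, and the LSTM/recursive models are omitted for brevity.

```python
# Hypothetical sketch of the two NWDAF tasks on synthetic data; all
# features, labels, and parameters are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import mean_absolute_error, roc_auc_score

rng = np.random.default_rng(0)

# --- Load prediction: hour-of-day features -> per-cell traffic load.
hours = rng.integers(0, 24, size=500)
X_reg = np.c_[np.sin(2 * np.pi * hours / 24), np.cos(2 * np.pi * hours / 24)]
load = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, 500)
reg = LinearRegression().fit(X_reg[:400], load[:400])
print("load MAE:", mean_absolute_error(load[400:], reg.predict(X_reg[400:])))

# --- Anomaly classification: cells with unusual traffic flagged as anomalies.
X_clf = rng.normal(0, 1, size=(500, 3))
y = (X_clf[:, 0] + 0.5 * X_clf[:, 1] ** 2 > 1.5).astype(int)  # synthetic rule
for model in (LogisticRegression(), GradientBoostingClassifier()):
    model.fit(X_clf[:400], y[:400])
    scores = model.predict_proba(X_clf[400:])[:, 1]
    print(type(model).__name__, "AUC:", round(roc_auc_score(y[400:], scores), 3))
```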
