Issue 01
Date 2020-03-30
Huawei and other Huawei trademarks are trademarks of Huawei Technologies Co., Ltd.
All other trademarks and trade names mentioned in this document are the property of their respective
holders.
Notice
The purchased products, services and features are stipulated by the contract made between Huawei and
the customer. All or part of the products, services and features described in this document may not be
within the purchase scope or the usage scope. Unless otherwise specified in the contract, all statements,
information, and recommendations in this document are provided "AS IS" without warranties, guarantees
or representations of any kind, either express or implied.
The information in this document is subject to change without notice. Every effort has been made in the
preparation of this document to ensure accuracy of the contents, but all statements, information, and
recommendations in this document do not constitute a warranty of any kind, express or implied.
Website: https://www.huawei.com
Email: support@huawei.com
Contents
1 Change History
1.1 SRAN16.1 01 (2020-03-30)
1.2 SRAN16.1 Draft A (2020-01-20)
4 Technical Description
4.1 Basic Concepts
4.1.1 Performance Counter
4.1.2 Measurement Object
4.1.3 Function Subset
4.1.4 Function Set
4.1.5 Measurement Period
4.2 Performance Counter Statistical Methods
4.2.1 Performance Counter Statistical Types
4.2.2 Performance Counter Aggregation Types
4.3 Counter-based Performance Measurement Process
4.4 Performance Counter Help Documents
4.5 Northbound Interface
4.5.1 Northbound Interface Overview
4.5.2 Recommended Interconnection Solution
4.5.3 Related Documents
4.6 Schemes for Adding or Deleting Performance Counters and Object Type Parameters
5 Glossary
6 Reference Documents
1 Change History

1.1 SRAN16.1 01 (2020-03-30)

Technical Changes

None

Editorial Changes

Revised descriptions in the document.

1.2 SRAN16.1 Draft A (2020-01-20)

Technical Changes

Change Description | Parameter Change

Editorial Changes

None
This document only provides guidance for feature activation. Feature deployment and
feature gains depend on the specifics of the network scenario where the feature is
deployed. To achieve the desired gains, contact Huawei professional service engineers.
Software Interfaces
Any parameters, alarms, counters, or managed objects (MOs) described in Feature
Parameter Description documents apply only to the corresponding software
release. For future software releases, refer to the corresponding updated product
documentation.
For definitions of base stations described in this document, see section "Base
Station Products" in SRAN Networking and Evolution Overview.
TDLBFD-004008 Performance Management
MLBFD-12000408 Performance Management
3.1 Definition
Performance management is a function defined in the Telecommunications
Management Network (TMN) model. This feature provides a method of efficiently
monitoring network performance, while facilitating network evaluation,
optimization, and troubleshooting.
3.2 Benefits
Performance management helps you:
● Check whether product-related features are effective.
● Detect deteriorating network performance, enabling telecom operators to take
measures to improve network quality.
● Troubleshoot network faults and provide suggestions on network quality
improvement.
● Monitor and optimize wireless and transport networks. This improves user
experience and helps telecom operators maximize the usage of existing device
resources.
● Offer network planning engineers detailed information required for network
expansion.
3.3 Architecture
Performance management includes measurement management, measurement
data collection, real-time performance monitoring, and performance threshold
alarms. The NEs, OSS, MAE-Evaluation, and NMS jointly provide these functions.
Figure 3-1 shows the architecture of performance management.
NEs
NEs in the performance management architecture include:
● BSC and eGBTS on the GSM network
● RNC and NodeB on the UMTS network
● Macro and other eNodeBs on the LTE network
● gNodeB on the NR network
● MBSC and multimode base station on the SingleRAN network
After receiving performance measurement result subscriptions from the MAE, NEs
collect performance counters that indicate the performance of physical and logical resources.
If an NE reset period includes the end time of a measurement period, the NE cannot report
the measurement result file for that period. Performance counter measurement data is
therefore missing from the start of that measurement period until the NE reset is complete.
Once the next measurement period ends after the NE reset, the NE reports measurement
result files normally.
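The reporting gap described above can be illustrated with a short sketch; the function and its inputs are illustrative only, not part of any MAE or NE API.

```python
from datetime import datetime, timedelta

def missing_periods(day_start, period_min, reset_start, reset_end):
    """Return the end times of measurement periods whose result files
    cannot be reported because the NE reset covered the period end."""
    missing = []
    t = day_start + timedelta(minutes=period_min)
    while t <= day_start + timedelta(hours=24):
        if reset_start <= t <= reset_end:
            missing.append(t)
        t += timedelta(minutes=period_min)
    return missing

# Example: 15-minute measurement periods, NE reset from 10:05 to 10:40.
# The periods ending at 10:15 and 10:30 fall inside the reset window,
# so their result files are never reported.
day = datetime(2020, 3, 30)
gaps = missing_periods(day, 15,
                       datetime(2020, 3, 30, 10, 5),
                       datetime(2020, 3, 30, 10, 40))
```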
MAE
The MBB Automation Engine (MAE) is the platform that executes performance
management. Through the graphical user interface (GUI), users can register
performance measurement tasks and process and monitor performance data. The
MAE provides the following performance management functions:
MAE-Evaluation & NMS
After obtaining the performance data from the MAE, the MAE-Evaluation parses
and stores the data in the mAOS database and regularly aggregates the data. The
mAOS can store key data for an extended period, query data flexibly in multiple
dimensions, and manage, generate, and distribute reports.
After NEs send performance MRFs to the MAE, the NMS can obtain the files from
specified directories on the MAE.
For details about MAE-Evaluation, see RAN Statistics Performance Visibility in
MAE-Evaluation Product Documentation.
4 Technical Description
Each performance counter has a unique ID and name. Table 4-1 describes an
example performance counter.
ID | Name | Description
Method 2: Add filter criteria to base counters (common performance counters and
extended performance counters). Counters that are defined by using this method
are also called event-based counters (EBCs). The common types of EBCs are as
follows: accessibility, retainability, mobility, delay, and cell resource. Each type
includes multiple base counters. For example, Number of RRC Connection
Establishment Attempts (Excluding Duplicated Attempts) is a base counter.
You can set the Establishment Cause filter criterion to RRC_EMERGENCY for this
base counter and create an EBC to measure the number of RRC connection
establishment attempts in emergency calls. The default measurement periods of
EBCs do not match the measurement periods supported by base counters. Currently,
EBCs support only the 15-minute measurement period, while base counters
support both short and long measurement periods. For details about short and long
measurement periods, see 4.1.5 Measurement Period. EBCs collect statistics based on the
call-granularity events recorded by NEs. EBC statistics are calculated on the OSS
from the CHRs reported by NEs, using filter criteria such as access causes,
failure causes, service characteristics, and operator IDs.
● After you start a measurement task for user-defined performance counters on the OSS,
the OSS automatically enables the measurement of common performance counters and
extended performance counters involved in the user-defined performance counters.
● eNodeBs support the EBC function as of eRAN3.0. RNCs support the EBC function as of
RAN15.0. The measurement period is 15 minutes.
● A gNodeB does not support the EBC function.
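As a rough illustration of the filtering idea behind EBCs, the following Python sketch applies an Establishment Cause filter to hypothetical call events; the event fields are invented for illustration and do not reflect the actual CHR schema.

```python
# Hypothetical sketch of how an event-based counter (EBC) can be
# derived from a base counter by applying a filter criterion.
# Field names below are invented, not the actual CHR record layout.

rrc_attempt_events = [
    {"establishment_cause": "RRC_EMERGENCY", "cell_id": 101},
    {"establishment_cause": "RRC_MO_DATA",   "cell_id": 101},
    {"establishment_cause": "RRC_EMERGENCY", "cell_id": 102},
]

# Base counter: number of RRC connection establishment attempts
base_counter = len(rrc_attempt_events)

# EBC: the same base counter restricted to emergency calls by the
# Establishment Cause filter criterion
ebc_emergency = sum(
    1 for e in rrc_attempt_events
    if e["establishment_cause"] == "RRC_EMERGENCY"
)
```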
Performance counters are measured for an object. Table 4-2 lists the example
object types of some NodeB performance counters.
Figure 4-1 Relationships between object types, function sets, function subsets, and
performance counters
NE | Supported Measurement Periods (min) | Default Measurement Period (min)
eGBTS | 15 and 60 | 60
NodeB | 15 and 60 | 60
gNodeB (NR) | 15 and 60 | 60
Measurement periods are classified into short measurement periods (5- and 15-minute)
and long measurement periods (15-, 30-, and 60-minute). The short and long measurement
periods cannot be set to 15 minutes concurrently.
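The period rules above can be expressed as a small validation sketch; the function and the named sets are illustrative, not an MAE API.

```python
# Illustrative check of the measurement-period rules described above:
# short periods are 5 or 15 minutes, long periods are 15, 30, or 60
# minutes, and both cannot be 15 minutes at the same time.
SHORT_PERIODS = {5, 15}
LONG_PERIODS = {15, 30, 60}

def validate_periods(short_min, long_min):
    """Raise ValueError if a (short, long) period pair is invalid."""
    if short_min not in SHORT_PERIODS:
        raise ValueError(f"invalid short period: {short_min}")
    if long_min not in LONG_PERIODS:
        raise ValueError(f"invalid long period: {long_min}")
    if short_min == long_min == 15:
        raise ValueError("short and long periods cannot both be 15 minutes")

validate_periods(15, 60)  # valid
validate_periods(5, 15)   # valid
```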
The mediation service of the MAE server parses the packets reported by NEs and
converts them into data of a uniform format. The performance service saves the
data required by users to the performance database. Through the northbound
interface of the MAE, the NE performance data can be sent to the NMS for
northbound users to analyze.
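The mediation step can be sketched as follows, assuming invented per-NE report shapes; the real packet formats are internal to the MAE.

```python
# Hypothetical sketch of mediation: NE reports arrive in RAT-specific
# shapes and are normalized into one uniform record format before the
# performance service writes them to the performance database.
# All field names here are invented for illustration.

def normalize(ne_type, raw):
    """Convert an NE-specific report into a uniform record."""
    if ne_type == "gNodeB":
        return {"object": raw["obj_dn"], "counter": raw["cid"],
                "value": raw["val"], "period_min": raw["gp"] // 60}
    if ne_type == "eNodeB":
        return {"object": raw["ldn"], "counter": raw["counter_id"],
                "value": raw["value"], "period_min": raw["period_min"]}
    raise ValueError(f"unknown NE type: {ne_type}")

rec = normalize("gNodeB", {"obj_dn": "gNodeB=7,Cell=3",
                           "cid": "1911816",
                           "val": 12, "gp": 900})
```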
Figure 4-2 shows the performance measurement process.
The following table describes the basic process of EBC-based MAE performance
measurement.
4 | Report CHR files | eNodeBs and RNCs report the CHR files to which the EBC subscribes to the Trace Server and SAU.
5 | Calculate counters | The Trace Server and SAU use the counter-defined script files to process the CHR files and generate counter statistical results.
The MAE opens a northbound interface for performance files to support
integration with third-party NMSs. It is recommended that the northbound
performance file interface use the XML format, which is defined by 3GPP
specifications for performance measurement result files. File export through
this interface involves a short delay and has the least impact on the MAE.
The file interface can periodically export performance data from the database to
files based on preset export criteria. The NMS can obtain the exported
performance files from a specified path on the MAE using FTP or SFTP. If the
NMS provides the IP address of a server and specifies a path on the server, the
MAE can upload files to the path on the server.
For details of the northbound performance file interface, see MAE Northbound
Performance File Interface Developer Guide (NE-Based).
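As a rough idea of what consuming such files involves, the sketch below parses a simplified measurement file loosely modeled on the 3GPP XML format; the element names and counter IDs are illustrative, and the real schema (including namespaces) differs, so refer to the developer guide cited above for the actual format.

```python
import xml.etree.ElementTree as ET

# A simplified measurement result file loosely modeled on the 3GPP
# XML format for performance measurement result files. The structure
# and counter IDs are invented for illustration.
SAMPLE = """
<measCollecFile>
  <measInfo>
    <measTypes>1526726657 1526726658</measTypes>
    <measValue measObjLdn="eNodeB=1,Cell=101">
      <measResults>42 7</measResults>
    </measValue>
  </measInfo>
</measCollecFile>
"""

def parse_counters(xml_text):
    """Map each measured object to a {counter_id: value} dict."""
    root = ET.fromstring(xml_text)
    results = {}
    for info in root.iter("measInfo"):
        ids = info.findtext("measTypes").split()
        for mv in info.iter("measValue"):
            values = mv.findtext("measResults").split()
            results[mv.get("measObjLdn")] = dict(zip(ids, map(int, values)))
    return results

print(parse_counters(SAMPLE))
# {'eNodeB=1,Cell=101': {'1526726657': 42, '1526726658': 7}}
```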
Counter names are longer than counter IDs and may vary with the NE version.
If counter names are used to interconnect the MAE with the NMS, performance
result files on the northbound interface become large, which reduces file
transfer and parsing efficiency.
Table 4-6 Schemes for deleting performance counters and object type parameters

Scenario: Old object type parameters are deleted or replaced by other parameters due to configuration model optimization.

Scheme (N-N+1):
● Old object type parameters are reserved and reported. The validity of the parameter values depends on the validity of the values of the corresponding parameters in the configuration model.
● New object type parameters are reported properly.

Scheme (N+2): Old performance counters are deleted.
Table 4-7 lists the documents containing information about deleted performance
counters and object type parameters.
5 Glossary
6 Reference Documents