
O-RAN.TIFG.E2E-Test.0-v04.00

Technical Specification

O-RAN Test and Integration Focus Group

End-to-end Test Specification

Copyright © 2022 by the O-RAN ALLIANCE e.V.


The copying or incorporation into any other work of part or all of the material available in this specification in any form without the prior written permission of O-RAN ALLIANCE e.V. is prohibited, save that you may print or download extracts of the material of this specification for your personal use, or copy the material of this specification for the purpose of sending to individual third parties for their information provided that you acknowledge O-RAN ALLIANCE as the source of the material and that you inform the third party that these conditions apply to them and that they must comply with them.

O-RAN ALLIANCE e.V., Buschkauler Weg 27, 53347 Alfter, Germany


Register of Associations, Bonn VR 11238, VAT ID DE321720189

Revision History

Date        Revision  Description
2021.03.06  01.00     Initial version
2021.07.09  02.00     Additional tests for Ch. 4-6; new Ch. 8 for load/stress tests, incl. Ch. 3 updates
2022.03.31  03.00     Incorporated CR VIA-0002, adding Ch. 9 RIC-enabled traffic steering use case test
2022.07.31  04.00     Incorporated CRs NOK-0008, VIA-0001, SAM-0001


Contents

Revision History ................................................................................................................................................. 2
1 Introductory Material ............................................................................................................. 11
1.1 Scope of the document ...................................................................................................................................... 11
1.2 References ......................................................................................................................................................... 11
1.3 Definitions and Abbreviations ........................................................................................................................... 12
1.3.1 Definitions ................................................................................................................................................... 12
1.3.2 Abbreviations ............................................................................................................................................... 13
1.4 Revision Guideline ............................................................................................................................................ 14
2 Overview ................................................................................................................................ 15
3 Testing methodology and configuration ................................................................................. 16
3.1 System under test ............................................................................................................................................... 17
3.1.1 Load and Stress Interfaces to SUT .............................................................................................................. 18
3.2 Test and measurement equipment and tools ...................................................................................................... 18
3.3 Test report .......................................................................................................................................................... 20
3.4 Data traffic ......................................................................................................................................................... 21
3.5 Mobility Classes ................................................................................................................................................ 22
3.6 Radio conditions ................................................................................................................................................ 22
3.7 Inter-cell interference ........................................................................................................................................ 24
3.8 Spectral efficiency ............................................................................................................................................. 25
4 Functional tests ....................................................................................................................... 26
4.1 LTE/5G NSA attach and detach of single UE ................................................................................................... 26
4.1.1 Test description and applicability ................................................................................................................ 26
4.1.2 Test setup and configuration ........................................................................................................................ 27
4.1.3 Test Procedure ............................................................................................................................................. 27
4.1.4 Test requirements (expected results) ........................................................................................................... 28
4.2 LTE/5G NSA attach and detach of multiple UEs ............................................................................................. 29
4.2.1 Test description and applicability ................................................................................................................ 29
4.2.2 Test setup and configuration ........................................................................................................................ 29
4.2.3 Test Procedure ............................................................................................................................................. 30
4.2.4 Test requirements (expected results) ........................................................................................................... 30
4.3 5G SA registration and deregistration of single UE .......................................................................................... 31
4.3.1 Test description and applicability ................................................................................................................ 31
4.3.2 Test setup and configuration ........................................................................................................................ 32
4.3.3 Test Procedure ............................................................................................................................................. 32
4.3.4 Test requirements (expected results) ........................................................................................................... 33
4.4 Intra-O-DU mobility .......................................................................................................................................... 33
4.4.1 Test description and applicability ................................................................................................................ 33
4.4.2 Test setup and configuration ........................................................................................................................ 34
4.4.3 Test Procedure ............................................................................................................................................. 35
4.4.4 Test requirements (expected results) ........................................................................................................... 35
4.5 Inter-O-DU mobility .......................................................................................................................................... 36
4.5.1 Test description and applicability ................................................................................................................ 36
4.5.2 Test setup and configuration ........................................................................................................................ 36
4.5.3 Test Procedure ............................................................................................................................................. 38
4.5.4 Test requirements (expected results) ........................................................................................................... 38
4.6 Inter-O-CU mobility .......................................................................................................................................... 39
4.6.1 Test description and applicability ................................................................................................................ 39


4.6.2 Test setup and configuration ........................................................................................................................ 39
4.6.3 Test Procedure ............................................................................................................................................. 40
4.6.4 Test requirements (expected results) ........................................................................................................... 41
4.7 Registration and deregistration to a single network slice .................................................................................. 42
4.7.1 Test description and applicability ................................................................................................................ 42
4.7.2 Test setup and configuration ........................................................................................................................ 42
4.7.3 Test Procedure ............................................................................................................................................. 42
4.7.4 Test requirements (expected results) ........................................................................................................... 43
4.8 Registration and deregistration to multiple network slices ............................................................................... 44
4.8.1 Test description and applicability ................................................................................................................ 44
4.8.2 Test setup and configuration ........................................................................................................................ 45
4.8.3 Test Procedure ............................................................................................................................................. 45
4.8.4 Test requirements (expected results) ........................................................................................................... 46
4.9 Idle Mode Intra-O-DU mobility ........................................................................................................................ 48
4.9.1 Test description and applicability ................................................................................................................ 48
4.9.2 Test setup and configuration ........................................................................................................................ 48
4.9.3 Test Procedure ............................................................................................................................................. 48
4.9.4 Test requirements (expected results) ........................................................................................................... 49
4.10 Idle mode Inter-O-DU mobility ........................................................................................................................ 50
4.10.1 Test description and applicability ................................................................................................................ 50
4.10.2 Test setup and configuration ........................................................................................................................ 50
4.10.3 Test Procedure ............................................................................................................................................. 51
4.10.4 Test requirements (expected results) ........................................................................................................... 52
4.11 Idle mode Inter-O-CU mobility ......................................................................................................................... 53
4.11.1 Test description and applicability ................................................................................................................ 53
4.11.2 Test setup and configuration ........................................................................................................................ 53
4.11.3 Test Procedure ............................................................................................................................................. 54
4.11.4 Test requirements (expected results) ........................................................................................................... 55
4.12 5G/4G Inter-RAT Mobility - 5G to LTE handover ........................................................................................... 55
4.12.1 Test description and applicability ................................................................................................................ 55
4.12.2 Test setup and configuration ........................................................................................................................ 56
4.12.3 Test Procedure ............................................................................................................................................. 56
4.12.4 Test requirements (expected results) ........................................................................................................... 57
4.13 5G/4G Inter-RAT Mobility - LTE to 5G handover ........................................................................................... 57
4.13.1 Test description and applicability ................................................................................................................ 57
4.13.2 Test setup and configuration ........................................................................................................................ 57
4.13.3 Test Procedure ............................................................................................................................................. 58
4.13.4 Test requirements (expected results) ........................................................................................................... 58
5 Performance tests ................................................................................................................... 60
5.1 Expected throughput calculation ....................................................................................................................... 61
5.1.1 4G LTE ........................................................................................................................................................ 61
5.1.2 5G NR .......................................................................................................................................................... 62
5.2 Downlink peak throughput ................................................................................................................................ 64
5.2.1 Test description and applicability ................................................................................................................ 64
5.2.2 Test setup and configuration ........................................................................................................................ 64
5.2.3 Test Procedure ............................................................................................................................................. 65
5.2.4 Test requirements (expected results) ........................................................................................................... 65
5.3 Uplink peak throughput ..................................................................................................................................... 67
5.3.1 Test description and applicability ................................................................................................................ 67
5.3.2 Test setup and configuration ........................................................................................................................ 67
5.3.3 Test Procedure ............................................................................................................................................. 67


5.3.4 Test requirements (expected results) ........................................................................................................... 68
5.4 Downlink throughput in different radio conditions ........................................................................................... 69
5.4.1 Test description and applicability ................................................................................................................ 69
5.4.2 Test setup and configuration ........................................................................................................................ 69
5.4.3 Test Procedure ............................................................................................................................................. 70
5.4.4 Test requirements (expected results) ........................................................................................................... 70
5.5 Uplink throughput in different radio conditions ................................................................................................ 71
5.5.1 Test description and applicability ................................................................................................................ 71
5.5.2 Test setup and configuration ........................................................................................................................ 71
5.5.3 Test Procedure ............................................................................................................................................. 72
5.5.4 Test requirements (expected results) ........................................................................................................... 73
5.6 Bidirectional throughput in different radio conditions ...................................................................................... 74
5.6.1 Test description and applicability ................................................................................................................ 74
5.6.2 Test setup and configuration ........................................................................................................................ 74
5.6.3 Test Procedure ............................................................................................................................................. 74
5.6.4 Test requirements (expected results) ........................................................................................................... 75
5.7 Downlink coverage throughput (link budget) ................................................................................................... 77
5.7.1 Test description and applicability ................................................................................................................ 77
5.7.2 Test setup and configuration ........................................................................................................................ 77
5.7.3 Test Procedure ............................................................................................................................................. 78
5.7.4 Test requirements (expected results) ........................................................................................................... 78
5.8 Uplink coverage throughput (link budget) ........................................................................................................ 79
5.8.1 Test description and applicability ................................................................................................................ 79
5.8.2 Test setup and configuration ........................................................................................................................ 80
5.8.3 Test Procedure ............................................................................................................................................. 80
5.8.4 Test requirements (expected results) ........................................................................................................... 81
5.9 Downlink aggregated cell throughput (cell capacity) ....................................................................................... 82
5.9.1 Test description and applicability ................................................................................................................ 82
5.9.2 Test setup and configuration ........................................................................................................................ 82
5.9.3 Test Procedure ............................................................................................................................................. 84
5.9.4 Test requirements (expected results) ........................................................................................................... 85
5.10 Uplink aggregated cell throughput (cell capacity) ............................................................................................ 86
5.10.1 Test description and applicability ................................................................................................................ 86
5.10.2 Test setup and configuration ........................................................................................................................ 86
5.10.3 Test Procedure ............................................................................................................................................. 87
5.10.4 Test requirements (expected results) ........................................................................................................... 87
5.11 Impact of fronthaul latency on downlink peak throughput ............................................................................... 89
5.11.1 Test description and applicability ................................................................................................................ 89
5.11.2 Test setup and configuration ........................................................................................................................ 89
5.11.3 Test Procedure ............................................................................................................................................. 90
5.11.4 Test requirements (expected results) ........................................................................................................... 91
5.12 Impact of fronthaul latency on uplink peak throughput .................................................................................... 92
5.12.1 Test description and applicability ................................................................................................................ 92
5.12.2 Test setup and configuration ........................................................................................................................ 92
5.12.3 Test Procedure ............................................................................................................................................. 93
5.12.4 Test requirements (expected results) ........................................................................................................... 93
5.13 Impact of midhaul latency on downlink peak throughput ................................................................................. 95
5.13.1 Test description and applicability ................................................................................................................ 95
5.13.2 Test setup and configuration ........................................................................................................................ 95
5.13.3 Test Procedure ............................................................................................................................................. 96
5.13.4 Test requirements (expected results) ........................................................................................................... 97
5.14 Impact of midhaul latency on uplink peak throughput ..................................................................................... 98


5.14.1 Test description and applicability ................................................................................................................ 98
5.14.2 Test setup and configuration ........................................................................................................................ 98
5.14.3 Test Procedure ............................................................................................................................................. 98
5.14.4 Test requirements (expected results) ........................................................................................................... 99
6 Services ................................................................................................................................ 101
6.1 Data Services ................................................................................................................................................... 101
6.1.1 Web Browsing ........................................................................................................................................... 102
6.1.1.1 Test Description ................................................................................................................................... 102
6.1.1.2 Test Setup ............................................................................................................................................. 102
6.1.1.3 Test Methodology/Procedure ............................................................................................................... 103
6.1.1.4 Test Expectation (expected results) ..................................................................................................... 103
6.1.2 File upload/download ................................................................................................................................ 105
6.1.2.1 Test Description ................................................................................................................................... 105
6.1.2.2 Test Setup ............................................................................................................................................. 105
6.1.2.3 Test Methodology/Procedure ............................................................................................................... 106
6.1.2.4 Test Expectation (expected results) ..................................................................................................... 106
6.2 Video Streaming .............................................................................................................................................. 108
6.2.1 Video Streaming – Stationary Test ............................................................................................................ 110
6.2.1.1 Test Description ................................................................................................................................... 110
6.2.1.2 Test Setup ............................................................................................................................................. 110
6.2.1.3 Test Methodology/Procedure ............................................................................................................... 111
6.2.1.4 Test Expectation (expected results) ..................................................................................................... 111
6.2.2 Video Streaming – Handover between same Master eNB but different O-RUs – Intra O-DU ................. 112
6.2.2.1 Test Description ................................................................................................................................... 112
6.2.2.2 Test Setup ............................................................................................................................................. 113
6.2.2.3 Test Methodology/Procedure ............................................................................................................... 113
6.2.2.4 Test Expectation (expected results) ..................................................................................................... 114
6.2.3 Video Streaming – Handover between same MeNB but different O-DUs – Inter O-DU Intra O-CU ...... 115
6.2.3.1 Test Description ................................................................................................................................... 115
6.2.3.2 Test Setup ............................................................................................................................................. 116
6.2.3.3 Test Methodology/Procedure ............................................................................................................... 116
6.2.3.4 Test Expectation (expected results) ..................................................................................................... 117
6.2.4 Video Streaming – Handover between same MeNB but different O-CUs – Inter O-CU ......................... 118
6.2.4.1 Test Description ................................................................................................................................... 118
6.2.4.2 Test Setup ............................................................................................................................................. 118
6.2.4.3 Test Methodology/Procedure ............................................................................................................... 119
6.2.4.4 Test Expectation (expected results) ..................................................................................................... 119
6.2.5 Video Streaming – Handover between different MeNBs while staying connected to same SgNB .......... 121
6.2.5.1 Test Description ................................................................................................................................... 121
6.2.5.2 Test Setup ............................................................................................................................................. 121
6.2.5.3 Test Methodology/Procedure ............................................................................................................... 122
6.2.5.4 Test Expectation (expected results) ..................................................................................................... 122
6.3 Voice Services – Voice over LTE (VoLTE) ................................................................................................... 123
6.3.1 VoLTE Stationary Test .............................................................................................................................. 125
6.3.1.1 Test Description ................................................................................................................................... 125
6.3.1.2 Test Setup ............................................................................................................................................. 125
6.3.1.3 Test Methodology/Procedure ............................................................................................................... 126
6.3.1.4 Test Expectation (expected results) ..................................................................................................... 126
6.3.2 VoLTE Handover Test .............................................................................................................................. 128
6.3.3 Voice Service - LTE and NR handover tests ............................................................................................. 128
6.3.3.1 Test Description ................................................................................................................................... 128


6.3.3.2 Test Setup ............................................................................................................................................. 128
6.3.3.3 Test Methodology/Procedure ............................................................................................................... 129
6.3.3.4 Test Expectation (expected results) ..................................................................................................... 129
6.4 Voice Service – EPS Fallback ......................................................................................................................... 131
6.4.1 EPS Fallback Test ...................................................................................................................................... 132
6.4.1.1 Test Description ................................................................................................................................... 132
6.4.1.2 Test Setup ............................................................................................................................................. 133
6.4.1.3 Test Methodology/Procedure ............................................................................................................... 134
6.4.1.4 Test Expectation (expected results) ..................................................................................................... 134
6.5 Voice Service – Voice over NR (VoNR) ........................................................................................................ 136
6.5.1 Voice over NR Test ................................................................................................................................... 137
6.5.1.1 Test Description ................................................................................................................................... 137
6.5.1.2 Test Setup ............................................................................................................................................. 137
6.5.1.3 Test Methodology/Procedure ............................................................................................................... 138
6.5.1.4 Test Expectation (expected results) ..................................................................................................... 138
6.5.2 VoNR – Intra-Distributed Unit (O-DU) handover .................................................................................... 140
6.5.2.1 Test Description ................................................................................................................................... 140
6.5.2.2 Test Setup ............................................................................................................................................. 140
6.5.2.3 Test Methodology/Procedure ............................................................................................................... 141
6.5.2.4 Test Expectation (expected results) ..................................................................................................... 141
6.5.3 VoNR – Intra-Central Unit (O-CU) Inter-Distributed Unit (O-DU) handover ......................................... 143
6.5.3.1 Test Description ................................................................................................................................... 143
6.5.3.2 Test Setup ............................................................................................................................................. 143
6.5.3.3 Test Methodology/Procedure ............................................................................................................... 144
6.5.3.4 Test Expectation (expected results) ..................................................................................................... 144
6.5.4 VoNR – Inter-Central Unit (O-CU) handover ........................................................................................... 145
6.5.4.1 Test Description ................................................................................................................................... 146
6.5.4.2 Test Setup ............................................................................................................................................. 146
6.5.4.3 Test Methodology/Procedure ............................................................................................................... 146
6.5.4.4 Test Expectation (expected results) ..................................................................................................... 147
6.6 Video Service – Video over LTE (ViLTE) ..................................................................................................... 148
6.6.1 ViLTE Stationary Test ............................................................................................................................... 150
6.6.1.1 Test Description ................................................................................................................................... 150
6.6.1.2 Test Setup ............................................................................................................................................. 150
6.6.1.3 Test Methodology/Procedure ............................................................................................................... 151
6.6.1.4 Test Expectation (expected results) ..................................................................................................... 151
6.6.2 ViLTE Handover Test ............................................................................................................................... 153
6.6.3 ViLTE - LTE to NR handover test ............................................................................................................ 153
6.6.3.1 Test Description ................................................................................................................................... 153
6.6.3.2 Test Setup ............................................................................................................................................. 153
6.6.3.3 Test Methodology/Procedure ............................................................................................................... 154
6.6.3.4 Test Expectation (expected results) ..................................................................................................... 155
6.7 Video Service – EPS Fallback ......................................................................................................................... 156
6.7.1 Video Service – EPS Fallback testing ....................................................................................................... 158
6.7.1.1 Test Description ................................................................................................................................... 158
6.7.1.2 Test Setup ............................................................................................................................................. 158
6.7.1.3 Test Methodology/Procedure ............................................................................................................... 159
6.7.1.4 Test Expectation (expected results) ..................................................................................................... 159
6.8 Video Service – Video over NR ...................................................................................................................... 161
6.8.1 Video over NR – Stationary Testing ......................................................................................................... 162
6.8.1.1 Test Description ................................................................................................................................... 162
6.8.1.2 Test Setup ............................................................................................................................................. 163


6.8.1.3 Test Methodology/Procedure ............................................................................................................... 163
6.8.1.4 Test Expectation (expected results) ..................................................................................................... 164
6.8.2 Video over NR – Intra-Distributed Unit (DU) handover ........................................................................... 165
6.8.2.1 Test Description ................................................................................................................................... 165
6.8.2.2 Test Setup ............................................................................................................................................. 165
6.8.2.3 Test Methodology/Procedure ............................................................................................................... 166
6.8.2.4 Test Expectation (expected results) ..................................................................................................... 167
6.8.3 Video over NR – Intra-Central Unit (CU) Inter-Distributed Unit (DU) handover .................................... 168
6.8.3.1 Test Description ................................................................................................................................... 168
6.8.3.2 Test Setup ............................................................................................................................................. 168
6.8.3.3 Test Methodology/Procedure ............................................................................................................... 169
6.8.3.4 Test Expectation (expected results) ..................................................................................................... 169
6.8.4 Video over NR – Inter-Central Unit (CU) handover ................................................................................. 171
6.8.4.1 Test Description ................................................................................................................................... 171
6.8.4.2 Test Setup ............................................................................................................................................. 171
6.8.4.3 Test Methodology/Procedure ............................................................................................................... 172
6.8.4.4 Test Expectation (expected results) ..................................................................................................... 172
6.9 URLLC ............................................................................................................................................................ 174
6.9.1 Augmented Reality .................................................................................................................................... 175
6.9.1.1 Test Description ................................................................................................................................... 175
6.9.1.2 Test Setup ............................................................................................................................................. 175
6.9.1.3 Test Methodology/Procedure ............................................................................................................... 176
6.9.1.4 Test Expectation (expected results) ..................................................................................................... 176
6.10 mMTC ............................................................................................................................................................. 178
6.10.1 Sensors ....................................................................................................................................................... 179
6.10.1.1 Test Description ................................................................................................................................... 179
6.10.1.2 Test Setup ............................................................................................................................................. 180
6.10.1.3 Test Methodology/Procedure ............................................................................................................... 180
6.10.1.4 Test Expectation (expected results) ..................................................................................................... 181
7 Security ................................................................................................................................. 183
7.1 gNB Security Assurance Specification (SCAS) required by 3GPP SA3 ........................................................ 183
7.2 Optional: DoS, fuzzing and blind exploitation types of security test .............................................................. 186
7.2.1 S-Plane PTP DoS Attack (Network layer) ................................................................................................. 186
7.2.1.1 Test description & applicability ........................................................................................................... 186
7.2.1.2 Test setup and configuration ................................................................................................................ 186
7.2.1.3 Test procedure ...................................................................................................................................... 187
7.2.1.4 Test requirements (expected results) .................................................................................................... 187
7.2.2 C-Plane eCPRI DoS Attack (Network layer) ............................................................................................ 187
7.2.2.1 Test description & applicability ........................................................................................................... 187
7.2.2.2 Test setup and configuration ................................................................................................................ 188
7.2.2.3 Test procedure ...................................................................................................................................... 188
7.2.2.4 Test requirements (expected results) .................................................................................................... 188
7.2.3 Near-RT RIC A1 Interface DoS Attack (Network layer) .......................................................................... 188
7.2.3.1 Test description & applicability ........................................................................................................... 188
7.2.3.2 Test setup and configuration ................................................................................................................ 189
7.2.3.3 Test procedure ...................................................................................................................................... 189
7.2.3.4 Test requirements (expected results) .................................................................................................... 189
7.2.4 S-Plane PTP Unexpected Input (Network layer) ....................................................................................... 189
7.2.4.1 Test description & applicability ........................................................................................................... 189
7.2.4.2 Test setup and configuration ................................................................................................................ 190
7.2.4.3 Test procedure ...................................................................................................................................... 190


7.2.4.4 Test requirements (expected results) .................................................................................................... 190
7.2.5 C-Plane eCPRI Unexpected Input (Network layer) ................................................................................... 190
7.2.5.1 Test description & applicability ........................................................................................................... 190
7.2.5.2 Test setup and configuration ................................................................................................................ 190
7.2.5.3 Test procedure ...................................................................................................................................... 191
7.2.5.4 Test requirements (expected results) .................................................................................................... 191
7.2.6 Near-RT RIC A1 Interface Unexpected Input (Network layer) ................................................................ 191
7.2.6.1 Test description & applicability ........................................................................................................... 191
7.2.6.2 Test setup and configuration ................................................................................................................ 191
7.2.6.3 Test procedure ...................................................................................................................................... 192
7.2.6.4 Test requirements (expected results) .................................................................................................... 192
7.2.7 Blind exploitation of well-known vulnerabilities over Near-RT RIC A1 interface (Network layer) ....... 192
7.2.7.1 Test description & applicability ........................................................................................................... 192
7.2.7.2 Test setup and configuration ................................................................................................................ 192
7.2.7.3 Test procedure ...................................................................................................................................... 193
7.2.7.4 Test requirements (expected results) .................................................................................................... 193
7.3 O-Cloud resource exhaustion type of security test (Virtualization layer) ....................................................... 193
7.3.1 O-Cloud side-channel DoS attack ............................................................................................................. 194
7.3.1.1 Test description & applicability ........................................................................................................... 194
7.3.1.2 Test setup and configuration ................................................................................................................ 194
7.3.1.3 Test procedure ...................................................................................................................................... 195
7.3.1.4 Test requirements (expected results) .................................................................................................... 195
8 Load and Stress Tests ........................................................................................................... 196
8.1 Simultaneous RRC_CONNECTED UEs ........................................................................................................ 196
8.1.1 Test description and applicability .............................................................................................................. 196
8.1.2 Test setup and configuration ...................................................................................................................... 197
8.1.3 Test Procedure ........................................................................................................................................... 197
8.1.4 Test requirements (expected results) ......................................................................................................... 198
8.2 Benchmark of UE State Transition ................................................................................................................. 198
8.2.1 Test description and applicability .............................................................................................................. 198
8.2.2 Test setup and configuration ...................................................................................................................... 199
8.2.3 Test Procedure ........................................................................................................................................... 200
8.2.4 Test requirements (expected results) ......................................................................................................... 200
8.3 Traffic Load Testing ........................................................................................................................................ 201
8.3.1 Test description and applicability .............................................................................................................. 201
8.3.2 Test setup and configuration ...................................................................................................................... 201
8.3.3 Test Procedure ........................................................................................................................................... 201
8.3.4 Test requirements (expected results) ......................................................................................................... 202
8.4 Traffic Model Testing ..................................................................................................................................... 203
8.4.1 Test description and applicability .............................................................................................................. 203
8.4.1.1 Traffic model ........................................................................................................................................ 203
8.4.2 Test setup and configuration ...................................................................................................................... 205
8.4.3 Test Procedure ........................................................................................................................................... 205
8.4.4 Test requirements (expected results) ......................................................................................................... 206
8.5 Long Hours Stability Testing .......................................................................................................................... 207
8.5.1 Test description and applicability .............................................................................................................. 207
8.5.2 Test setup and configuration ...................................................................................................................... 207
8.5.3 Test Procedure ........................................................................................................................................... 207
8.5.4 Test requirements (expected results) ......................................................................................................... 208
8.6 Multi-cell Testing ............................................................................................................................................ 208
8.6.1 Test description and applicability .............................................................................................................. 208


1 8.6.2 Test setup and configuration ...................................................................................................................... 209


2 8.6.3 Test Procedure ........................................................................................................................................... 209
3 8.6.4 Test requirements (expected results) ......................................................................................................... 210
4 8.7 Emergency call ................................................................................................................................................ 210
5 8.7.1 Test description and applicability .............................................................................................................. 210
6 8.7.2 Test setup and configuration ...................................................................................................................... 210
7 8.7.3 Test Procedure ........................................................................................................................................... 211
8 8.7.4 Test requirements (expected results) ......................................................................................................... 211
9 8.8 ETWS (Earthquake and Tsunami Warning System) ....................................................................................... 212
10 8.8.1 Test description and applicability .............................................................................................................. 212
11 8.8.2 Test setup and configuration ...................................................................................................................... 212
12 8.8.3 Test Procedure ........................................................................................................................................... 212
13 8.8.4 Test requirements (expected results) ......................................................................................................... 213
14 RIC-Enabled End-to-end Use Case Test .............................................................................. 214
15 9.1 Test Setup and Methodology ........................................................................................................................... 214
16 9.2 Traffic Steering Use Case Test ........................................................................................................................ 215
17 9.2.1 Test Configuration ..................................................................................................................................... 216
18 9.1.1.1 System Level Configuration ................................................................................................................ 216
19 9.1.1.2 Cell Level Configuration ..................................................................................................................... 216
20 9.2.1.1 UE Level Configuration ....................................................................................................................... 217
21 9.2.1.2 Expected Result and KPIs .................................................................................................................... 218
22 9.2.2 RIC-TS1 - Traffic Steering (Connected Mode Mobility) Test Case ......................................................... 219
23 9.2.2.1 Test Description and Applicability ...................................................................................................... 219
24 9.2.2.2 System Level Configuration ................................................................................................................ 220
25 9.1.1.3 Cell Level Configuration ..................................................................................................................... 220
26 9.2.2.3 UE Level Configuration ....................................................................................................................... 221
27 9.2.2.4 Initial Condition ................................................................................................................................... 222
28 9.2.2.5 Test Procedure...................................................................................................................................... 223
29 9.2.2.6 Expected Result.................................................................................................................................... 223
30 Annex A: Template of test report................................................................................................................... 225

1 Introductory Material
2 1.1 Scope of the document
3 The present document has been produced by the O-RAN ALLIANCE (O-RAN).

4 The present document provides a description of the End-to-end (E2E) System Tests.

5 1.2 References
6 The following documents contain provisions which, through reference in this text, constitute provisions of the present
7 document.
8 - References are either specific (identified by date of publication, edition number, version number, etc.) or
9 non-specific.
10 - For a specific reference, subsequent revisions do not apply.
11 - For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including
12 a GSM document), a non-specific reference implicitly refers to the latest version of that document in Release 15.
13 [1] 3GPP TR 21.905: “Vocabulary for 3GPP Specifications”
14 [2] 3GPP TS 36.104, “Evolved Universal Terrestrial Radio Access (E-UTRA); Base Station (BS) radio transmission and reception”
16 [3] NGMN Alliance, “Definition of the testing framework for the NGMN 5G pre-commercial networks trials”,
17 White paper July 2019, version 3.0. Available: http://www.ngmn.org
18 [4] NGMN Alliance, “NGMN 5G pre-commercial networks trials - major conclusions”, White paper, December 2019, version 1.0. Available: http://www.ngmn.org
20 [5] 3GPP TR 38.913: “Study on new radio access technology Radio interface protocol aspects”, March 2017.
21 [6] IETF RFC7323, “TCP Extensions for High Performance”, September 2014
22 [7] 3GPP TS 38.215, “Physical layer measurements”, September 2020
23 [8] 3GPP TS 36.133, “Requirements for support of radio resource management”
24 [9] 3GPP TS 38.133, “Requirements for support of radio resource management”
25 [10] O-RAN ALLIANCE, “O-RAN Architecture Description v2.0”, July 2020
26 [11] O-RAN ALLIANCE, “O-RAN End-to-End System Testing Framework Specification 1.0”, July 2020
27 [12] O-RAN ALLIANCE, “O-RAN Fronthaul Interoperability Test Specification (IOT) 2.0”, April 2020
28 [13] ITU-R M.2410-0, “Minimum requirements related to technical performance for IMT-2020 radio
29 interface(s)”, November 2017.
30 [14] 3GPP TR 36.814, “Further advancements for E-UTRA physical layer aspects”, March 2017
31 [15] 3GPP TS 38.306, "User Equipment (UE) radio access capabilities", December 2020
32 [16] O-RAN ALLIANCE, “O-RAN Fronthaul Control, User and Synchronization Plane Specification v5.0”,
33 November 2020
34 [17] 3GPP TS 38.300, “NR and NG-RAN Overall Description; Stage 2”, December 2020
35 [18] 3GPP TS 38.101-1, “User Equipment (UE) radio transmission and reception; Part 1: Range 1 Standalone”,
36 December 2020


1 [19] 3GPP TS 38.101-2, “User Equipment (UE) radio transmission and reception; Part 2: Range 2 Standalone”,
2 December 2020
3 [20] 3GPP TS 38.211, “Physical channels and modulation”, December 2020
4 [21] 3GPP TS 38.213, “Physical layer procedures for control”, December 2020
5 [22] 3GPP TS 38.214, “Physical layer procedures for data”, December 2020
6 [23] 3GPP TR 38.308, “Physical Layer Aspects”, September 2017
7 [24] 3GPP TSG RAN WG1 Meeting #92 R1-1801352, “Discussion on NR UE peak data rate”, March 2018
8 [25] 3GPP TS 36.211, “Physical channels and modulation”, September 2020
9 [26] 3GPP TR 38.801, “Radio access architecture and interfaces”, March 2017
10 [27] 3GPP TS 33.511: “Security Assurance Specification (SCAS) for the next generation Node B (gNodeB)
11 network product class” (Release 16), September 2020
12 [28] 3GPP TS 23.502, “Procedures for the 5G System; Stage-2”, December 2020
13 [29] 3GPP TS 23.401 “General Packet Radio Service (GPRS) enhancements for Evolved Universal Terrestrial Radio Access Network (E-UTRAN) access”
15 [30] 3GPP TS 37.340 “Overall description; Stage-2”
16 [31] 3GPP TS 38.401 “5G; NG-RAN; Architecture description”
17 [32] 3GPP TS 38.473 “5G; NG-RAN; F1 Application Protocol (F1AP)”
18 [33] 3GPP TS 37.470 “W1 interface General aspects and principles”, July 2020
19 [34] NGMN Alliance, “Vertical URLLC Use Cases and Requirements”, February 2020, version 2.5.4. Available: http://www.ngmn.org
21 [35] 3GPP TS 28.552: “Management and orchestration; 5G performance measurements”, December 2020
22 [36] O-RAN ALLIANCE, “O-RAN Working Group 1 Use Cases Detailed Specification v07.00,” November
23 2021
24 [37] O-RAN ALLIANCE, “O-RAN Working Group 2 Non-RT RIC & A1 Interface: Use Cases and
25 Requirements v05.00,” November 2021
26 [38] O-RAN ALLIANCE, “O-RAN Working Group 3 Use Cases and Requirements v01.00”, July 2021
27 [39] 3GPP TS 23.501: “System architecture for the 5G System (5GS)”, December 2020
28 [40] 3GPP TS 38.306: “User Equipment (UE) radio access capabilities”, December 2021
29 [41] 3GPP TS 23.501: “System architecture for the 5G System (5GS)”, December 2020
30 [42] O-RAN ALLIANCE, “Certification and Badging Processes and Procedures”, April 2022
31 [43] O-RAN ALLIANCE, “Certification and Badging Test Report Templates”, April 2022

32 1.3 Definitions and Abbreviations

33 1.3.1 Definitions

34 For the purposes of the present document, the terms and definitions given in 3GPP TR 21.905 [1] and the following
35 apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP
36 TR 21.905 [1].


1 C-Plane Control Plane: refers specifically to real-time control between O-DU and O-RU, and should not be
2 confused with the UE’s control plane
3 NSA Non-Stand-Alone network mode that supports operation of SgNB attached to MeNB
4 O-CU O-RAN Central Unit – a logical node hosting PDCP, RRC, SDAP and other control functions.
5 This can be considered short-hand for the O-CU-CP and O-CU-UP in an O-RAN system
6 O-DU O-RAN Distributed Unit: a logical node hosting RLC/MAC/High-PHY layers based on a lower
7 layer functional split. O-DU in addition hosts an M-Plane instance.
8 O-RU O-RAN Radio Unit: a logical node hosting Low-PHY layer and RF processing based on a lower
9 layer functional split. This is similar to 3GPP’s “TRP” or “RRH” but more specific in including
10 the Low-PHY layer (FFT/iFFT, PRACH extraction). O-RU in addition hosts an M-Plane instance.
11 PTP Precision Time Protocol (PTP) is a protocol for distributing precise time and frequency over
12 packet networks. PTP is defined in the IEEE Standard 1588.
13 PDCCH Physical Downlink Control Channel applies for LTE and NR air interface
14 PBCH Physical Broadcast Channel applies for LTE and NR air interface
15 SA Stand-Alone network mode that supports operation of gNB attached to a 5G Core Network
16 SCS OFDM Sub Carrier Spacing
17 SSB Synchronization Signal Block, in 5G PBCH and synchronization signal are packaged as a single
18 block
19 S-Plane Synchronization Plane: Data flow for synchronization and timing information between nodes
20 U-Plane User Plane: refers to IQ sample data transferred between O-DU and O-RU
21 Non-RT RIC Non-real-time RAN Intelligent Controller: a logical function that enables non-real-time control
22 and optimization of RAN elements and resources, AI/ML workflow including model training and
23 updates, and policy-based guidance of applications/features in Near-RT RIC.
24 Near-RT RIC Near-real-time RAN Intelligent Controller: a logical function that enables near-real-time control
25 and optimization of RAN elements and resources via fine-grained (e.g. UE basis, Cell basis) data
26 collection and actions over E2 interface.

27 1.3.2 Abbreviations

28 For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An
29 abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in
30 3GPP TR 21.905 [1].
31 CFI Control format indicator
32 DL Downlink: data flows from the core network towards the UE
33 DoS Denial of service
34 DUT Device under Test
35 E2E End-to-End
36 eNB Evolved Node B (LTE base station)
37 FFS For further study
38 gNB Next-generation Node B (5G NR base station)
39 IOT Interoperability Testing


1 IUT Interface under Test


2 KPI Key performance indicator
3 MCS Modulation and coding scheme
4 OTA Over the air
5 PDSCH Physical downlink shared channel
6 PUSCH Physical uplink shared channel
7 PRB Physical resource block (12 x Resource Elements per PRB)
8 QUIC Quick UDP Internet Connections
9 SUT System under test
10 TCP Transmission Control Protocol: connection-oriented IP protocol
11 TIFG O-RAN Test and Integration Focus Group
12 UDP User Datagram Protocol: connectionless IP protocol
13 UE User Equipment: terminology for a mobile device/terminal in LTE and NR
14 UL Uplink: data flows from the UE towards the core network
15 VoLTE Voice over LTE
16 VoNR Voice over NR
17 ViLTE Video over LTE

18 1.4 Revision Guideline


19 The contents of the present document are subject to continuing work within O-RAN and may change following formal
20 O-RAN approval. Should the O-RAN modify the contents of the present document, it will be re-released by O-RAN with
21 an identifying change of release date and an increase in version number as follows:

22 Release x.y.z
23 Where:
24 x the first digit is incremented for all changes of substance, i.e. technical enhancements, corrections, updates,
25 etc. (the initial approved document will have x=01).
26 y the second digit is incremented when editorial only changes have been incorporated in the document.
27 z the third digit included only in working versions of the document indicating incremental changes during the
28 editing process.
29

30
31


1 Overview
2 This test specification is focused on validating the end-to-end system functionality, performance, and key features of the
3 O-RAN system as a black box. It is based on the principles outlined in the end-to-end system test framework document
4 [11].

5 Chapter 3 specifies the test methodology, tools, and baseline configurations, which apply to the entire specification,
6 unless otherwise indicated.

7 Chapter 4 specifies the baseline functional testing of the radio access network from an end-to-end perspective.

8 Chapter 5 specifies the performance testing of the radio access network from an end-to-end perspective, with a focus on end-user experienced key performance indicators (KPIs).

10 Chapter 6 specifies the test cases and KPIs around the application level services of the radio access network, including data, voice and video as experienced by an end-user.

12 Chapter 7 specifies the test cases around end-to-end radio access network security.

13 Chapter 8 specifies load and stress test cases which assess the stability and robustness of the gNB/eNB under loaded scenarios.

15 Future versions of this specification may add additional chapters for test cases on timing and synchronization,
16 operations, administration, and maintenance (OAM), and other cases of relevance from an end-to-end system
17 perspective.

18


1 Testing methodology and configuration


2 This chapter describes the common testing methods and configurations which are used in the subsequent chapters. To ensure fair and comparable test results among various test campaigns, consistent test setups should be utilized. The test conditions should reflect a realistic operational environment as closely as possible so that the results are meaningful and close to real-world behavior. This end-to-end test specification harmonizes the test conditions, methodologies, and procedures, but the test configuration (parameters) of the DUT/SUT is not specified in this document. However, the complete test configuration must be recorded in the test report to enable the test to be reproduced if needed, and to allow the test results to be used for other purposes, e.g. benchmarking or comparison.

9 There are several design areas in the RAN where vendors can differentiate, such as RF performance (e.g. receiver sensitivity, PA design), radio link adaptation algorithms (e.g. radio channel estimation, MCS selection, MIMO mode selection, transmission mode selection, UL power control), and scheduling and overhead management (e.g. number of control channels). These different approaches result in competitive advantages leading to differences in end-to-end performance, which can be assessed with the tests defined in this document. Hence, it is not possible to set pass/fail criteria for all the tests; pass/fail criteria are set wherever possible, e.g. in the functional tests in Chapter 4. The expected performance values are also indicated for reference network configurations.

16 Unless otherwise stated in this document, the tests are suitable for both laboratory as well as field environments. All
17 laboratory tests should be conducted over a cable, or in case of OTA tests, inside a shielded box/room, to ensure
18 repeatability. In the laboratory, radio signal strength (i.e. attenuation) on the 5G NR path and/or 4G LTE path can be
19 modified by using variable attenuators. The end-user device, if used, should be placed inside a shielded box to avoid
20 interference from external signals. The laboratory environment should allow for stable and repeatable testing conditions,
21 and it is more suitable for benchmarking. On the other hand, the field environment allows for the evaluation of complex
22 scenarios with realistic radio channel variations and behavior of the network (e.g. inter-cell interference and handovers).
23 It is assumed that the field tests will be performed over the air (OTA). In the field, radio signal strength is modified by
24 placing the UE in different positions inside the cell.

25 Unless otherwise stated in this document, the same operating system (e.g. Windows 10) with default settings and
26 configuration should be utilized for both ends (i.e. the host applications at end-user device (test UE) and application
27 (traffic) server) in order to ensure a consistent test environment.

28 Unless otherwise stated in this document, the tests are suitable for both TDD and FDD.

29 Unless otherwise stated in this document, the following network architectures [26] depicted in Figure 3-1 below are
30 addressed and supported:

31 • 4G LTE – Option 1

32 • 5G NR standalone (5G SA) - Option 2

33 • 5G NR non-standalone (5G NSA) - Option 3 / Option 3a / Option 3x


(Figure not reproduced in this text version: block diagrams of the five options – 4G LTE – Option 1, 5G SA – Option 2, and 5G NSA – Option 3 / Option 3a / Option 3x – showing the EPC/5GC, 4G LTE and 5G NR nodes and their connections.)

Figure 3-1: The network architectures supported in this document - red dashed lines indicate control plane while blue solid lines indicate user plane.

3 3.1 System under test


4 The whole O-RAN system is the System under Test (SUT) and can be viewed as an integrated black box in the context
5 of E2E testing [11], i.e. the internal functionality and architecture of SUT is out of scope. It is expected that all involved
6 O-RAN functions and interfaces can properly interoperate together, and an end-to-end communication link can be
7 established between the end-user device and the application server or another end-user device. The testing of
8 interoperability and conformance of the internal functions of the SUT is out of scope for this document. The SUT is
9 expected to be in service mode and run in their normal operation state. The E2E KPIs are defined across the whole end-
10 to-end communication link between the end-user device and the application (traffic) server or another end-user device –
11 see Figure 3-2.

(Figure not reproduced in this text version: the UE connects over Uu to the O-RAN System, which is the System under Test (SUT), and onward over S1/NG to the 3GPP Core, 3GPP Services and Other Services; the E2E KPIs span the whole chain from UE to services.)

Figure 3-2: The O-RAN system as System under Test (SUT) and E2E KPIs


1 The end-to-end communication link includes O-RAN as well as non-O-RAN (e.g. core, end-user device) components
2 which could negatively affect the end-to-end performance, e.g. limited capability of the test UEs and/or the application
3 servers to generate/receive enough data traffic, or a bottleneck in the transport network. All these unwanted contributions
4 should be avoided or at least minimized in order to measure unbiased KPIs. In addition, there can be a performance
5 difference between different vendors and chipsets depending on their level of maturity. Commercial (production grade)
6 devices (e.g. test UE) are preferred whenever possible, ensuring the tests are sufficiently documented, stable and
7 repeatable.

8 All O-RAN components [10] (such as O-CU-CP and O-CU-UP, O-DU, O-RU) and interfaces [10] (such as Open
9 Fronthaul, X2) included in the System under Test are recommended to have been tested against their respective
10 conformance and interoperability O-RAN specifications if they are available.

11 3.1.1 Load and Stress Interfaces to SUT

12 A key priority of load and stress testing is to exercise the performance of the SUT near or beyond full capacity. The capacity of the O-DU and O-CU within the SUT may be high enough that generating the traffic required to reach full capacity would need more O-RUs than is feasible. Figure 3-3 therefore defines additional traffic insertion points where traffic may be applied directly at the Open Fronthaul or F1 interfaces of the SUT, providing background UE stack traffic in addition to the test traffic applied at the Uu interface to the O-RU(s) in the SUT. This should allow adequate stress on the complete SUT for execution of the E2E test scenarios.

(Figure not reproduced in this text version: within the O-RAN System (SUT), O-RU #1..#n connect over Open Fronthaul (OFH) to O-DU #1..#m, which connect over F1 to the O-CU and onward over S1/NG to the 3GPP core and services; the E2E KPIs span the whole chain. Test traffic is applied at the Uu interface via UE(s); an optional UE+O-RU emulator injects traffic at the Open Fronthaul, and an optional UE+O-DU emulator injects traffic at F1.)

Figure 3-3: Interfaces for applying incremental traffic to System Under Test for Load and Stress scenarios

29 3.2 Test and measurement equipment and tools


30 All the tests shall be performed in a non-intrusive manner; that is, in a manner in which the SUT is not required to support any functionality or mode of operation beyond that required for normal operation in a production network. The SUT is not expected to be used as a test tool when deployed in a production network, and therefore it should not be used as a test tool during end-to-end testing.

34 All the measurement equipment and tools used in the tests must be properly calibrated and configured in advance in order to minimize the influence of the test equipment on the measurement results. The parameters (e.g. attenuation) of cables, attenuators, splitters, combiners, etc. must also be measured in advance and compensated for in the final measurement results.

38 Table 3-1 Test and measurement equipment and tools


Real UE and/or UE emulator:
The UE (real UE or UE emulator) is used to establish a stateful end-to-end connection and to generate or receive data traffic.
The real UE used in this context as a test tool is typically a UE designed for commercial or testing applications, with certain test and diagnostic functions enabled for test and measurement purposes. Such test and diagnostic functions should not affect the performance.
The real UE requires a SIM card (real or emulated) which is pre-provisioned with subscriber profiles. A UE emulator or multiple real UEs can be used in multi-UE test scenarios requiring multiple UE sessions. The UE is connected with the SUT either via RF cables or via an over the air (OTA) connection. In a lab environment, the UE should be placed inside an RF shielded box/room in order to avoid interference from external signals.
The logging tool connected to the UE is used to capture measurements and KPI logs for test validation and reporting.

UE + O-RU emulator:
The UE + O-RU emulator is used to establish stateful connections and to generate or receive data traffic directly at the Open fronthaul connection to an O-DU within the O-RAN SUT.
The UE + O-RU emulator is typically designed for testing applications with certain test and diagnostic functions enabled for test and measurement purposes.
The UE + O-RU emulator is used to create background traffic driven by multiple stateful UE stacks for the purpose of increasing the load on the O-DU, in addition to the E2E traffic applied at the Uu interface to the SUT.

UE + O-DU emulator:
The UE + O-DU emulator is used to establish stateful connections with, and to generate or receive data and signaling traffic directly at, the F1 connection to the O-CU within the O-RAN SUT.
The UE + O-DU emulator is typically designed for testing applications with certain test and diagnostic functions enabled for test and measurement purposes.
The UE + O-DU emulator is used to create background traffic driven by multiple UE connections for the purpose of increasing the load on the O-CU, in addition to the E2E traffic applied at the Uu interface to the SUT.

4G/5G Core or Core emulator:
The 4G/5G core or core emulator is used to terminate 4G/5G NAS sessions, and to support core network procedures required for RAN (SUT) testing. The 4G/5G core or core emulator must support end-to-end connection and data transfer between the application server and the real UE/UE emulator.

IMS Core or IMS Core emulator:
The IMS core or IMS core emulator is used to support voice and video calling services like VoLTE, ViLTE, VoNR, Video over NR and EPS Fallback using protocols like SIP and RTP. The IMS core or IMS core emulator should interface with the 4G/5G core to set up dedicated bearers/QoS Flows to support voice and video calling services.

Application (traffic) server:
The application (traffic) server is used as an endpoint for generation and/or termination of various data traffic streams to/from the real UE(s)/UE emulator. The application server should be capable of generating data traffic for the services under test.
The application server should be placed as close as possible to the core/core emulator, and connected to the core/core emulator via a transport link with sufficient capacity.

Protocol analyzer:
The protocol analyzer is used for test results verification, and for troubleshooting and root cause analysis of failed tests. Note that if IPsec encryption is applied at the network interface, then it would not be possible to use the protocol analyzer without decryption of IPsec.

Network impairment emulator:
The network impairment emulator is used for tests which require insertion of impairment (packet delay and/or jitter) at the network interface (e.g. Open fronthaul).


RF attenuators and/or Fading generator:
RF attenuators are used for tests which require radio signal attenuation. Fading generators can be used to simulate specific radio channel conditions (e.g. Urban, Rural, High Speed Train).

RF shielded box/room:
The RF shielded box/room is used for over the air (OTA) connectivity between the UE and SUT in the lab environment. The RF shielded box/room should support reliable MIMO testing, if MIMO is required.

Packet generation tool / DoS emulator:
The packet generation tool / Denial of Service (DoS) emulator is used for DoS traffic generation in security tests. The tool must support crafting network traffic on network layers 2 to 7, conforming to network protocols such as Ethernet, IP, UDP, TCP, PTP, eCPRI, TLS and HTTP/HTTPS. The tool is intended to be deployable in various network segments (communication planes) according to the testing needs.

Packet capture tool:
The packet capture tool is used to capture samples of data traffic for validation, analysis, and troubleshooting. In the case of security test cases it can be used to capture samples of legitimate traffic, which can then be used as templates for fuzzing attacks. The tool must support capturing network traffic on network layers 2 to 7, conforming to network protocols such as Ethernet, IP, UDP, TCP, PTP, eCPRI, TLS, QUIC and HTTP/HTTPS. The tool is intended to be deployable in various network segments (communication planes) according to the testing needs.

Network tap:
A network tap is a hardware or software device which provides access and visibility to the data flowing across a computer network.

Fuzzing tool:
The protocol fuzzing tool is used for unexpected protocol input generation in security tests. The tool must support mutating and replaying captured network traffic on network layers 2 to 7, conforming to network protocols such as Ethernet, IP, UDP, TCP, PTP, eCPRI, TLS and HTTP/HTTPS. The tool is intended to be deployable in various network segments (communication planes) according to the testing needs.

Vulnerability scanning tool:
The vulnerability scanning tool is used for blind exploitation of well-known vulnerabilities during security tests. The tool should rely on a cyclically updated database of known vulnerabilities based on Common Vulnerabilities and Exposures (CVE) and should support scanning network services running on the TCP/IP protocol stack. The tool is intended to be deployable in various network segments (communication planes) according to the testing needs.

NFV benchmarking and resource exhaustion tool:
The Network Function Virtualization (NFV) tool is used for O-Cloud system performance measurement and for generation of resource exhaustion types of DoS attack. This tool should be deployable on any type of O-Cloud environment (public or private) with support for test VNF(s) and/or CNF(s).

1 3.3 Test report


2 Tests should be described in the test report in sufficient detail to allow them to be reproduced by different parties; unified reporting of test results is essential for benchmarking and comparison. The following common minimum set of configuration parameters and information about the test environment needs to be reported in each test report [3]:

6 • Carrier frequency

7 • Total transmission (effective) bandwidth and number of total RBs

8 • Duplex mode (e.g. FDD, TDD)

9 • Sub-carrier spacing

10 • Cyclic prefix length

11 • Slot length


1 • Number of supported MIMO layers at both eNB/gNB and UE sides

2 • Antenna configuration (number of Tx/Rx antenna elements, e.g. 4T4R) at both eNB/gNB and UE sides

3 • Transmit power at antenna connectors at both eNB/gNB and UE sides

4 • Antenna gains at both eNB/gNB and UE sides

5 • DL/UL ratio (configuration) in case of TDD duplex mode

6 • Test UE position with respect to O-RU antenna (e.g. LOS/NLOS/nLOS) in case of field test

7 • Deployment scenario (e.g. indoor hotspot, macro/micro, dense urban, urban, rural) in case of field test

8 • List of utilized test and measurement equipment and tools (incl. logging tools, test UE(s)), including the type and version. The Operating Systems (incl. the version) used at the end-user device and application server should be noted as well. If TCP performance has been measured, the TCP configuration parameter settings should also be noted.

12 • Information about the SUTs (e.g. O-RU, O-DU, O-CU-CP, O-CU-UP) including the type, parameters,
13 configuration, SW and HW versions, Interface profiles (e.g. Open fronthaul IOT profile [12]).
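As a non-normative illustration, the minimum parameter set above can be captured in machine-readable form and attached to the test report; the field names and example values in the following Python sketch are assumptions for illustration only:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class TestReportConfig:
        """Subset of the minimum configuration record of Section 3.3."""
        carrier_frequency_hz: float
        bandwidth_hz: float
        total_rbs: int
        duplex_mode: str              # "FDD" or "TDD"
        subcarrier_spacing_hz: int
        cyclic_prefix: str
        slot_length_ms: float
        mimo_layers_dl: int
        antenna_config: str           # e.g. "4T4R"
        tdd_dl_ul_ratio: str          # only meaningful for TDD

    # Example values for a 100 MHz NR TDD carrier at 30 kHz SCS (273 PRBs).
    cfg = TestReportConfig(3.5e9, 100e6, 273, "TDD", 30_000, "normal",
                           0.5, 4, "4T4R", "7:3")
    print(json.dumps(asdict(cfg), indent=2))  # attach alongside the test logs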

14 The template for a complete test report for non-O-RAN Badge testing can be found in Annex A. Any E2E testing for the
15 purpose of seeking an O-RAN Badge must comply with the procedures defined in [42] and the report templates defined
16 in [43]. Photos should also be taken as part of the test report in order to illustrate the test environment. Additional
17 parameters and counters are specified in the description of each test in the subsequent chapters.

18 3.4 Data traffic


19 Full buffer and finite buffer traffic models are utilized in the tests.

20 The full buffer traffic model is characterized by a constant number of users in the cell during the test, wherein the buffers of the users' data flows always have an unlimited amount of data to transmit. The model is preferred due to its simplicity.

22 In the finite buffer traffic model, a user is assigned a finite payload to transmit or receive when it arrives. The user arrival process of this model captures the fact that the users in the network are not all active at the same time; rather, they become active when they start a data session that requires the download/upload of data. Examples of such models are the FTP traffic models 1/2/3 [14] largely used in 3GPP simulations.
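For illustration, the user arrival process of a finite buffer model can be scripted as follows. This is a non-normative sketch assuming the commonly used FTP model 3 style parameterization (fixed-size files arriving per UE as a Poisson process); the 0.5 MB payload and the arrival rate are example values only:

    import random

    def ftp_model3_arrivals(rate_files_per_s: float, duration_s: float,
                            payload_bytes: int = 500_000, seed: int = 1):
        """Generate Poisson file arrivals for one UE: exponentially
        distributed inter-arrival times, fixed payload per file."""
        rng = random.Random(seed)
        t, arrivals = 0.0, []
        while True:
            t += rng.expovariate(rate_files_per_s)
            if t > duration_s:
                return arrivals
            arrivals.append((t, payload_bytes))  # (arrival time [s], bytes)

    print(len(ftp_model3_arrivals(0.2, 600.0)))  # ~120 files in 10 minutes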

26 Data throughput can be measured at different protocol layers. Each network protocol layer adds extra overhead (header
27 information), thereby reducing the data throughput available to the layer above. The highest throughput is provided at
28 the physical layer including user data and the overhead from higher protocol layers. The data throughput at the RLC
29 (Radio Link Control) layer is independent from radio-specific overhead and therefore well-suited for comparison with
30 other access technologies. The application layer throughput is the net throughput seen by user applications operating on
31 top of either UDP, TCP or QUIC – for example, the typical FTP overhead is around 3-5%, typical HTTP overhead is
32 around 30% compared to RLC throughput. Unless otherwise stated in this document, the reported throughput (user data
33 rate) shall consider all the overhead (control channels, reference signals).
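As a short worked example of the overhead figures above (the 100 Mbps RLC-layer throughput is an arbitrary illustrative value, not a requirement):

    rlc_mbps = 100.0                  # example RLC-layer throughput
    ftp_app = rlc_mbps * (1 - 0.04)   # typical FTP overhead ~3-5% -> ~96 Mbps
    http_app = rlc_mbps * (1 - 0.30)  # typical HTTP overhead ~30% -> 70 Mbps
    print(f"FTP: ~{ftp_app:.0f} Mbps, HTTP: ~{http_app:.0f} Mbps at application layer")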

34 UDP (User Datagram Protocol), TCP (Transmission Control Protocol) and QUIC (Quick UDP Internet Connections) are
35 typical transport layer protocols utilized in the tests.

36 UDP is a simple, connection-less transport layer protocol which provides no delivery guarantee or error recovery. UDP throughput is more suitable for benchmarking as it is not affected by the system configuration parameters. UDP is also faster, lighter (less overhead) and more efficient than TCP.
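A minimal sketch of a constant-rate UDP sender is shown below for illustration; in practice a calibrated traffic generator (see the application server in Table 3-1) would be used, and the destination address, port and rate here are placeholder assumptions:

    import socket, time

    def udp_constant_rate(dst: str, port: int, rate_mbps: float, duration_s: float):
        """Send fixed-size UDP datagrams at an approximately constant rate.
        Pacing by sleep() is coarse; real test tools pace in kernel/hardware."""
        payload = b"\x00" * 1200                         # stay below typical MTU
        interval = len(payload) * 8 / (rate_mbps * 1e6)  # seconds per datagram
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            sock.sendto(payload, (dst, port))
            time.sleep(interval)

    udp_constant_rate("198.51.100.10", 5001, rate_mbps=10, duration_s=5)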


1 TCP is a reliable, connection-oriented transport layer protocol which includes error-checking and recovery, and guarantees data delivery with preserved order of data packets. The performance of a TCP connection can be impacted by various factors such as end-to-end latency, number of retransmissions, packet loss, and TCP configuration parameters such as window size, window scale, timestamps, etc. [6]. The default values of TCP configuration parameters can also vary between Operating Systems (incl. different versions of the same OS) used at the end-user device (test UE) and application (traffic) server. It is recommended to use the same OS (incl. the version) with default settings at the end-user device (test UE) and application server. Since the settings and behavior of TCP connections cannot be easily unified and normalized, the measurement of TCP performance is recommended only as an illustrative indicator, and UDP performance should be used for the benchmarking and assessment of system performance.
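Since the TCP parameter settings must be noted in the test report (Section 3.3), they can be collected automatically; a minimal sketch, assuming a Linux host where these parameters are exposed under /proc/sys:

    from pathlib import Path

    # TCP parameters relevant to throughput testing [6]; Linux sysctl names.
    PARAMS = [
        "net.ipv4.tcp_window_scaling",
        "net.ipv4.tcp_timestamps",
        "net.ipv4.tcp_rmem",
        "net.ipv4.tcp_wmem",
        "net.ipv4.tcp_congestion_control",
    ]

    for name in PARAMS:
        path = Path("/proc/sys") / name.replace(".", "/")
        value = path.read_text().strip() if path.exists() else "n/a"
        print(f"{name} = {value}")  # record these values in the test report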

10 QUIC (Quick UDP Internet Connections) is a general-purpose transport layer protocol built on top of UDP to support
11 the next generation of application layer protocols. QUIC provides features like connection establishment, congestion
12 control, stream multiplexing and forward error correction to provide a secure and reliable connection-oriented protocol
13 over UDP. QUIC is being used as the standard transport mechanism for HTTP/3.

14 In addition, the following application layer protocols are utilized in the tests.

15 Hypertext Transfer Protocol (HTTP) is the application layer protocol used in the internet. HTTP is a stateless protocol
16 which follows the request-response model between client and the server. The client places a request for a resource to the
17 server, and the server responds back to the client with the requested resource and/or the appropriate response code. HTTP's support for headers between client and server makes this protocol simple, extensible, and powerful.

19 Session Initiation Protocol (SIP) is an application layer signalling protocol for real-time sessions like IP telephony. This
20 is a text-based protocol which allows negotiation between two end points to initiate a session, maintain the session and
21 terminate the session. SIP is the default signalling protocol used in the telecom network for VoLTE, ViLTE, VoNR and
22 Video over NR.

23 Real-time Transport Protocol (RTP) is an application layer protocol used to transmit real-time data such as audio and
24 video over IP network. RTP is the default data plane protocol used in the telecom network for services like VoLTE,
25 ViLTE, VoNR and Video over NR. RTP does not guarantee Quality of Service but works in conjunction with Real Time
26 Control Protocol (RTCP) to detect and convey packet loss and jitter information.

27 File Transfer Protocol (FTP) is an application layer protocol to transfer files on a computer network. FTP follows a client-server model where the client can upload a file to the server, or download a file from the server. The FTP protocol uses two separate connections between the client and the server – one for control and the other for the data or file transfer. FTP, along with multiple variants of the protocol, has become the de-facto standard for transferring files on the internet.

31 3.5 Mobility Classes


32 The following classes of mobility are defined [13]:

33 • Stationary: 0 kph

34 • Pedestrian: 0 kph to 10 kph (typical value 4 kph)

35 • Vehicular: 10 kph to 120 kph (typical value 50 kph)

36 • High speed vehicular: 120 kph to 500 kph (typical value 150 kph)

37 High speed vehicular speeds close to 500 kph are mainly used for high speed trains.
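A small helper mapping a measured UE speed to these classes (assigning boundary values to the lower class is an assumption of this sketch; [13] does not state which class owns the boundaries):

    def mobility_class(speed_kph: float) -> str:
        """Map a UE speed to the mobility classes of Section 3.5 [13]."""
        if speed_kph <= 0:
            return "stationary"
        if speed_kph <= 10:
            return "pedestrian"
        if speed_kph <= 120:
            return "vehicular"
        return "high speed vehicular"

    print(mobility_class(50))  # -> "vehicular" (typical vehicular value)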

38 3.6 Radio conditions


39 The radio signal quality is described by the radio parameters such as RSRP and SINR. These radio parameters are defined
40 differently in LTE and 5G NR.


1 • LTE RSRP (Reference Signal Received Power) [7] is defined as the linear average over the power contributions
2 (in [W]) of the resource elements that carry cell-specific reference signals (CRS) within the considered
3 measurement frequency bandwidth. The RSRP is reported from UE back to eNB. The reporting range of RSRP
4 is defined from -140dBm to -44dBm [8].

5 • LTE SINR (Signal to Noise and Interference Ratio) has not been formally defined in the 3GPP specifications. The UE does not send the results back to the eNB; the SINR is measured and used only in the UE. Specific implementations may vary, and it is up to the manufacturer to decide how to implement this measurement, which makes it difficult to compare results of different devices. In [7], SINR is defined as the linear average over the power contribution (in [W]) of the resource elements carrying cell-specific reference signals divided by the linear average of the noise and interference power contribution (in [W]) over the resource elements carrying cell-specific reference signals within the same frequency bandwidth.

12 • 5G SS-RSRP (Synchronization Signal based Reference Signal Received Power) [7] is defined as the linear
13 average over the power contributions (in [W]) of the resource elements that carry secondary synchronization
14 (SS) signals. SS-RSRP is the equivalent of the RSRP parameter used in LTE systems. The reporting range of
15 SS-RSRP is defined from -140dBm to -40dBm [9].

16 • 5G SS-SINR [7] is defined as the linear average over the power contribution (in [W]) of the resource elements
17 carrying secondary synchronisation signals divided by the linear average of the noise and interference power
18 contribution (in [W]) over the resource elements carrying secondary synchronisation signals within the same
19 frequency bandwidth. The SS-SINR is reported from UE back to gNB. The reporting range of SS-SINR is
20 defined from -23dB to 40dB.

21 It is worth noting that in 5G, the Channel State Information Reference Signal (CSI-RS) can also be used for RSRP and SINR measurements. Due to the different transmit powers of CSI-RS and SS, CSI-RS-based SINR and RSRP measurement values are usually greater than SS-based SINR and RSRP.

24 The minimum coupling loss (MCL) [2] between O-RU (antenna) and UE must be ensured:

25 • Macro cell deployment scenario (wide area BS): MCL = 70dB corresponding to minimal O-RU (antenna) to
26 UE distance along the ground equal to around 35 m

27 • Small cell (micro cell) deployment scenario (medium range BS): MCL = 53dB corresponding to minimal O-RU
28 (antenna) to UE distance along the ground equal to around 5 m

29 • Pico cell deployment scenario (local area BS): MCL = 45dB corresponding to minimal O-RU (antenna) to UE
30 distance along the ground equal to around 2 m
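The quoted distances can be roughly reproduced with a free-space path-loss calculation; the sketch below assumes free-space propagation and a 2 GHz carrier, both of which are illustrative assumptions since the document does not state a propagation model or frequency:

    import math

    def fspl_distance_m(mcl_db: float, freq_hz: float) -> float:
        """Distance where free-space path loss equals the coupling loss:
        FSPL [dB] = 20*log10(d [m]) + 20*log10(f [Hz]) - 147.55."""
        return 10 ** ((mcl_db + 147.55 - 20 * math.log10(freq_hz)) / 20)

    for scenario, mcl in [("macro", 70), ("micro", 53), ("pico", 45)]:
        print(f"{scenario}: MCL {mcl} dB -> ~{fspl_distance_m(mcl, 2e9):.0f} m")
    # -> ~38 m, ~5 m and ~2 m, in line with the distances quoted above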

31 The radio parameters (RSRP, SINR) should be measured across the entire range covering scenarios from cell centre to
32 cell edge. Based on the test results, the RSRP and SINR distribution statistics can be calculated and described as a
33 cumulative distribution function (CDF) curve. According to the CDF curve, the four types of radio conditions can be
34 defined as: excellent (95%-100%), good (80%-90%), fair (40%-60%) and poor (5%-15%) [3]. Table 3-2 shows the RSRP and SINR thresholds for various radio conditions.

36 Note that the RSRP values should primarily be used for UL assessments, and the SINR values for DL assessment.

37

38 Table 3-2 RSRP and SINR thresholds for various radio conditions

Radio conditions | RSRP / SS-RSRP (dBm) | DL SINR / DL SS-SINR (dB) | Notes
Excellent (cell centre) | > -75 | > 25 | Utilization of the highest possible MCS, transport block size and MIMO rank; peak performance measurements; negligible interference from neighbor cells
Good | -75 to -90 (typical value = -85) | 15 to 20 (typical value = 17) | -
Fair | -90 to -105 (typical value = -95) | 5 to 10 (typical value = 7) | -
Poor (cell edge) | < -105 (typical value = -110) | < 5 (typical value = 3) | Minimum performance measurements; strong interference from neighbor cells
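For automated evaluation, the thresholds of Table 3-2 can be applied as in the sketch below. Taking the worse of the two indicators when they disagree is an assumption of this sketch (the table does not define how to combine them), and the gaps between ranges (e.g. SINR between 20 and 25 dB) are binned to the lower class here:

    def radio_condition(rsrp_dbm: float, sinr_db: float) -> str:
        """Classify RSRP/SS-RSRP and DL SINR/SS-SINR per Table 3-2."""
        def rsrp_class(v):
            if v > -75:
                return "excellent"
            if v >= -90:
                return "good"
            if v >= -105:
                return "fair"
            return "poor"
        def sinr_class(v):
            if v > 25:
                return "excellent"
            if v >= 15:
                return "good"
            if v >= 5:
                return "fair"
            return "poor"
        order = ["poor", "fair", "good", "excellent"]
        return min(rsrp_class(rsrp_dbm), sinr_class(sinr_db), key=order.index)

    print(radio_condition(-85, 17))  # -> "good" (typical values of Table 3-2)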
1 There are many different factors that influence signal strength and quality during field testing; these factors include, but are not limited to, the following:

3 • Proximity to the cellular tower (antenna)

4 • Load in neighbor cells

5 • Surrounding physical barriers (mountains, buildings, etc.)

6 • Weather conditions

7 3.7 Inter-cell interference


8 The tests are conducted either in a single cell scenario without any inter-cell interference or in a multi-cell scenario where
9 the serving cell is surrounded by neighboring cells generating traffic load (interference on the serving cell) in the downlink
10 or uplink directions.

11 Generating a realistic traffic load is important for meaningful results. As the number of real UEs is always a limiting factor, artificial (dummy) traffic load and interference generation can be used.

13 In the single cell scenario, the serving cell is isolated, and all the surrounding neighbor cells are turned off (neither control
14 nor data channels are used).

15 In the multi-cell scenario, all the neighbor cells are turned on. The following load setups are defined:

16 • Load 0% - all the surrounding neighbor cells are turned on without any data traffic and end-user device attached.
17 Inter-cell interference is generated only on control channels (broadcasting, synchronization channels) without
18 any inter-cell interference on data channels.

19 • Load 30% - all the surrounding neighbor cells are turned on with data traffic. Inter-cell interference is generated
20 both on control and data channels. The level of interference on data channel is controlled by the amount of data
21 traffic. The interference level of 30% in downlink means that 30% of downlink PRBs are randomly occupied
22 with a dummy traffic. In uplink, this interference level corresponds to 3dB rise of IoT (Interference over
23 Thermal) noise at the receiver side (i.e. eNB/gNB antenna(s)). The received interference noise from the UEs of
24 neighbor cells uplink transmission should lead to 3dB rise of receiver’s noise power [3].

25 • Load 50% - all the surrounding neighbor cells are turned on with data traffic. Inter-cell interference is generated
26 both on control and data channels. The level of interference on data channel is controlled by the amount of data
27 traffic. The interference level of 50% in downlink means that 50% of downlink PRBs are randomly occupied
28 with a dummy traffic. In uplink, this interference level corresponds to 5dB rise of IoT (Interference over
29 Thermal) noise at the receiver side (i.e. eNB/gNB antenna(s)).

30 • Load 70% - all the surrounding neighbor cells are turned on with data traffic. Inter-cell interference is generated
31 both on control and data channels. The level of interference on data channel is controlled by the amount of data
32 traffic. The interference level of 70% in downlink means that 70% of downlink PRBs are randomly occupied
33 with a dummy traffic. In uplink, this interference level corresponds to 7dB rise of IoT (Interference over
34 Thermal) noise at the receiver side (i.e. eNB/gNB antenna(s)).

35 • Load 100% - fully loaded multi-cell scenario generating the highest possible inter-cell interference. All the
36 surrounding neighbor cells are turned on with data traffic. Inter-cell interference is generated both on control and


1 data channels. The level of interference on data channel is controlled by the amount of data traffic. The
2 interference level of 100% in downlink means that 100% of downlink PRBs are occupied with a dummy traffic.
3 In uplink, this interference level corresponds to 9dB rise of IoT (Interference over Thermal) noise at the receiver
4 side (i.e. eNB/gNB antenna(s)).
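For scripting the dummy-traffic configuration, the load setups above can be summarized compactly; the dictionary layout below is an illustration, not part of the specification:

    # Neighbor-cell load levels of Section 3.7: DL PRB occupancy of the
    # dummy traffic and the corresponding UL rise of Interference over
    # Thermal (IoT) noise at the eNB/gNB receiver.
    LOAD_LEVELS = {
        0:   {"dl_prb_occupancy": 0.0, "ul_iot_rise_db": 0.0},
        30:  {"dl_prb_occupancy": 0.3, "ul_iot_rise_db": 3.0},
        50:  {"dl_prb_occupancy": 0.5, "ul_iot_rise_db": 5.0},
        70:  {"dl_prb_occupancy": 0.7, "ul_iot_rise_db": 7.0},
        100: {"dl_prb_occupancy": 1.0, "ul_iot_rise_db": 9.0},
    }

    print(LOAD_LEVELS[50])  # -> {'dl_prb_occupancy': 0.5, 'ul_iot_rise_db': 5.0}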

5 3.8 Spectral efficiency


6 The spectral (or spectrum) efficiency (SE) is an important criterion for fair performance assessment and benchmarking
7 of different systems when various transmission bandwidths, duplex modes (FDD/TDD) and TDD DL/UL configurations
8 are normalized.

9 The spectral efficiency is calculated by dividing the data throughput by the aggregated channel bandwidth (incl. guard
10 bands) in DL or UL assuming single user and FDD duplex mode. The corresponding link frame structure is fully (100%)
11 utilized in frequency and time domains.

SE_FDD_DL [bps/Hz] = DL data throughput [bps] / DL channel bandwidth [Hz]

SE_FDD_UL [bps/Hz] = UL data throughput [bps] / UL channel bandwidth [Hz]

14 In case of TDD, where the same spectrum is used at different times for the uplink and downlink, the spectral efficiency is additionally divided by the fraction of resources (slots and symbols, not including the switching gap) allocated to the particular link direction within the 10 ms radio frame:

SE_TDD_DL [bps/Hz] = (DL data throughput [bps] / channel bandwidth [Hz]) * (number of total symbols per frame / number of DL symbols per frame)

SE_TDD_UL [bps/Hz] = (UL data throughput [bps] / channel bandwidth [Hz]) * (total number of symbols per frame / number of UL symbols per frame)
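A worked example of the TDD normalization follows; all numbers are illustrative (280 symbols corresponds to a 10 ms frame of 20 slots x 14 symbols at 30 kHz SCS):

    def se_tdd(throughput_bps: float, bandwidth_hz: float,
               link_symbols: int, total_symbols: int) -> float:
        """Normalized TDD spectral efficiency per Section 3.8:
        (throughput / bandwidth) * (total symbols / link-direction symbols)."""
        return throughput_bps / bandwidth_hz * total_symbols / link_symbols

    # 500 Mbps DL over a 100 MHz TDD carrier with 70% of symbols in DL:
    print(se_tdd(500e6, 100e6, link_symbols=196, total_symbols=280))
    # -> ~7.14 bps/Hz (vs. 5.0 bps/Hz without the TDD normalization)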


1 Functional tests
2 This chapter describes the tests evaluating and assessing the functionality of the radio access network from a network
3 end-to-end perspective. The focus of the testing is on the end-user functionality based on 3GPP and O-RAN
4 specifications. Pass-fail criteria are defined for the tests wherever possible.

5 The general test methodologies and configurations are mentioned in Chapter 3.

6 Unless otherwise stated in the chapter, the tests are suitable and can be conducted in both laboratory as well as field
7 environments, with pros and cons of both environments as described in Chapter 3.

8 The following end-to-end functional tests are defined in this chapter as an extension of the NGMN testing framework [3]:

9 Table 4-1: E2E Functionality Test Case Summary

Test ID | Test case (E2E Functionality Assessment) | Functional group | LTE | NSA | SA
4.1 | LTE/5G NSA Attach and detach of single UE | Accessibility | Y | Y | N/A
4.2 | LTE/5G NSA Attach and detach of multiple UEs | Accessibility | Y | Y | N/A
4.3 | 5G SA Registration and deregistration of single UE | Accessibility | N/A | N/A | Y
4.4 | Intra-Distributed Unit mobility | Mobility | N/A | Y | Y
4.5 | Inter-Distributed Unit mobility | Mobility | N/A | Y | Y
4.6 | Inter-Central Unit mobility | Mobility | N/A | Y | Y
4.7 | 5G SA registration and deregistration to a single network slice | Accessibility | N/A | N/A | Y
4.8 | 5G SA registration and deregistration to multiple network slices | Accessibility | N/A | N/A | Y
4.9 | Idle Mode Intra-O-DU mobility | Mobility | N/A | N/A | Y
4.10 | Idle Mode Inter-O-DU mobility | Mobility | N/A | N/A | Y
4.11 | Idle Mode Inter-O-CU mobility | Mobility | N/A | N/A | Y
4.12 | 5G/4G Inter-RAT mobility - 5G to LTE handover | Mobility | Y | N/A | Y
4.13 | 5G/4G Inter-RAT mobility - LTE to 5G handover | Mobility | Y | N/A | Y

11 The test description, setup and procedures are detailed in the following sections for each test case.

12 4.1 LTE/5G NSA attach and detach of single UE

13 4.1.1 Test description and applicability

14 The purpose of this test is to validate E2E O-RAN C-plane functionality with a single UE. These tests are valid for
15 either LTE or 5G NSA. In this test scenario, the successful attach and detach procedure shall be validated by the “Power
16 ON” and “Power OFF” of a single UE, as described in the following specifications:

17 1. LTE Attach as per 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach

18 2. LTE Detach as per 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN

19 3. 5G NSA Attach as per 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach and 3GPP TS 37.340
20 [30] Section 10.2.1 EN-DC (Secondary Node Addition)

_______________________________________________________________________________________________.
© 2022 by the O-RAN ALLIANCE e.V. Your use is subject to the copyright statement on the cover page of this specification. 26
O-RAN.TIFG.E2E-Test.0-v04.00

4. 5G NSA Detach as per 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN and 3GPP TS 37.340 [30], Section 10.4.1 EN-DC (Secondary Node Release)

3 The test procedure shall be performed in excellent radio conditions for 10 iterations. Attach success rate, detach success
4 rate, and attach latency shall be measured and captured.

5 4.1.2 Test setup and configuration

6 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with a
7 stationary UE (real or emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP
8 (for LTE) or 5G SS-RSRP (for 5G NSA) as the metric. Within the cell there should be only one active UE. The test is
9 suitable for lab as well as field environments.

10 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
11 recorded in the test report.

12 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
13 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
14 a UE emulator. The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G
15 SS-RSRP (for 5G NSA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should
16 not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via
17 cable.

18 Field setup: The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio
19 conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA) as defined in Section 3.6) should be observed. The
20 minimum coupling loss (see Section 3.6) should not be exceeded.

21 Please refer to Figure 5-1 for the E2E test setups for LTE and 5G NSA.

22 4.1.3 Test Procedure


23 The test steps below are applicable for either LTE or 5G NSA:

24 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
25 report. The serving cell under test is activated and unloaded. All other cells are powered off.

26 2. The UE (real or emulated UE) is placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
27 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA) in Section 3.6.

28 3. The End-to-end setup shall be operational for LTE or 5G NSA as applicable for the test scenario, and there should
29 not be any connectivity issues.

30 4. Start the logs to capture the call flow and signalling messages

31 5. “Power ON” the UE to attach to the LTE or 5G NSA cell. Wait for a successful attach.

32 6. “Power OFF” the connected UE to detach from the network. Wait for a successful detach.

33 7. Stop and save the test logs. The logs should be captured and kept for test result reference and measurements

8. Repeat steps 4 to 7 for a total of 10 times and record the KPIs mentioned in Section 4.1.4 (a minimal automation sketch follows).
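The loop in steps 4 to 8 lends itself to automation. The sketch below is a minimal, hypothetical Python harness: the UeController and TraceLogger protocols (power_on(), wait_attached(), start(), stop_and_save(), and so on) are assumptions standing in for whatever UE-control and trace-capture tooling the test bed actually provides.

```python
import time
from typing import Protocol

class UeController(Protocol):
    """Hypothetical UE-control interface provided by the test bed."""
    def power_on(self) -> None: ...
    def power_off(self) -> None: ...
    def wait_attached(self, timeout_s: float) -> bool: ...
    def wait_detached(self, timeout_s: float) -> bool: ...

class TraceLogger(Protocol):
    """Hypothetical signalling-trace capture interface."""
    def start(self) -> None: ...
    def stop_and_save(self, path: str) -> None: ...

def run_attach_detach(ue: UeController, log: TraceLogger, iterations: int = 10) -> list:
    """Automate steps 4-8: attach/detach the UE `iterations` times, keeping logs."""
    results = []
    for i in range(1, iterations + 1):
        log.start()                              # step 4: start signalling capture
        t0 = time.monotonic()
        ue.power_on()                            # step 5: "Power ON" -> attach
        attached = ue.wait_attached(timeout_s=60.0)
        latency_s = time.monotonic() - t0 if attached else None
        ue.power_off()                           # step 6: "Power OFF" -> detach
        detached = ue.wait_detached(timeout_s=60.0)
        log.stop_and_save(f"iteration_{i}.log")  # step 7: keep logs for reference
        results.append({"attached": attached, "detached": detached,
                        "attach_latency_s": latency_s})
    return results
```

Note that the attach latency KPI in Section 4.1.4 is defined between the attach request and attach complete messages in the captured signalling; the wall-clock value above is only a rough proxy.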


1 4.1.4 Test requirements (expected results)

2 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
3 should be captured and reported in the test report for performance assessment

4 • Radio parameters such as RSRP, RSRQ

5 • KPIs mentioned in Table 4-2 and Table 4-3

6 Validate the successful procedures from the collected logs. Expected success rate for Attach/Detach and Secondary Node
7 Addition/Release KPI is 100%. The attach-detach procedure should pass 10 consecutive times to mark the test case as
8 passing. The gap analysis should be provided for the measured and the expected target KPIs.

9 • LTE Attach-Detach test case validation and KPI measurements

• Validate successful attach 10 times with the LTE cell (refer to 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach). In the UE logs or applications installed on the UE, check that the UE is attached to the correct cell (e.g. PCI, Global eNB ID, ARFCN as per the test configuration).

13 • Measure the attach success rate by validating attach request and attach complete for each iteration.
14 Record the attach success rate in Table 4-2

• Measure the attach latency by calculating the time between attach request and attach complete. Capture the latency for each iteration and sort the observed values in ascending order. Record the minimum, average (sum of all latency values divided by the total number of iterations, 10 in this case) and maximum latency in Table 4-3 (a computation sketch is given after these bullets).

• Validate successful detach with the LTE cell (refer to 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN). Signalling connection release should be validated from the message flow (UE context release and RRC connection release messages). Measure the detach success rate by validating detach request and detach accept for each iteration. Record the detach success rate in Table 4-2.

24 • 5G NSA Attach-Detach test case validations and KPI measurements

• Validate successful attach 10 times with the 5G NSA cell (3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach and 3GPP TS 37.340 [30], Section 10.2.1 EN-DC Secondary Node Addition). In the UE logs or applications installed on the UE, check that the UE is attached to the correct cell (e.g. PCI, Global eNB ID/Global gNB ID, ARFCN/NR-ARFCN as per the test configuration for the LTE/5G cells).

29 • Measure the attach success rate by validating attach request and attach complete for each iteration. Also
30 measure the secondary node addition success rate by validating the SgNB addition request and SgNB
31 reconfiguration complete as per flow 3GPP TS 37.340 [30] Section 10.2.1 EN-DC for each iteration.
32 Record the attach success rate and secondary node addition success rate in Table 4-2

• Validate successful detach (LTE detach and 5G Secondary Node Release) with the 5G NSA cell (refer to 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN and 3GPP TS 37.340 [30], Section 10.4.1 EN-DC Secondary Node Release). Signalling connection release should be validated from the message flow (UE context release and RRC connection release messages). The 5G secondary node should also be released successfully. Measure the detach success rate by validating detach request and detach accept for each iteration. Record the detach success rate in Table 4-2.
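A minimal sketch of the KPI arithmetic described above, assuming each iteration has already been reduced to success flags and an attach latency taken from the captured signalling (extracting those from the test bed's own logs is outside this sketch):

```python
from statistics import mean

# Hypothetical per-iteration results extracted from the signalling logs.
iterations = [
    {"attach_ok": True, "detach_ok": True, "attach_latency_ms": 152.0},
    {"attach_ok": True, "detach_ok": True, "attach_latency_ms": 147.5},
    # ... 10 iterations in total
]

def success_rate(results, key):
    """Success rate in percent over all iterations (target: 100%)."""
    return 100.0 * sum(r[key] for r in results) / len(results)

latencies = sorted(r["attach_latency_ms"] for r in iterations
                   if r["attach_latency_ms"] is not None)
report = {
    "attach_success_rate_pct": success_rate(iterations, "attach_ok"),
    "detach_success_rate_pct": success_rate(iterations, "detach_ok"),
    "attach_latency_min_ms": latencies[0],
    "attach_latency_max_ms": latencies[-1],
    "attach_latency_avg_ms": mean(latencies),  # sum of values / total iterations
}
print(report)
```

The same arithmetic applies unchanged to the per-UE KPIs of Section 4.2 and to the registration/de-registration latencies recorded in Sections 4.3, 4.7 and 4.8.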

Table 4-2 KPI to be captured for single UE attach-detach test case

LTE KPI:     Attach Success Rate | Detach Success Rate
5G NSA KPI:  Attach Success Rate | Detach Success Rate | SgNB Addition Success Rate

Table 4-3 Latency KPI for attach

LTE Attach Time (millisecond):  Minimum | Maximum | Average

3 4.2 LTE/5G NSA attach and detach of multiple UEs

4 4.2.1 Test description and applicability

5 The purpose of this test is to validate E2E O-RAN C-plane functionality with multiple UEs. These tests are valid for
6 either LTE or 5G NSA. In this test scenario, the successful attach and detach procedure shall be validated by the “Power
7 ON” and “Power OFF” of multiple (at least 2) UEs, as described in the following specifications:

8 1. LTE Attach as per 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach

9 2. LTE Detach as per 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN

10 3. 5G NSA Attach as per 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach and 3GPP TS 37.340
11 [30] Section 10.2.1 EN-DC (Secondary Node Addition)

4. 5G NSA Detach as per 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN and 3GPP TS 37.340 [30], Section 10.4.1 EN-DC (Secondary Node Release)

14 The test procedure shall be performed in excellent radio conditions for 10 iterations. Attach success rate, detach success
15 rate, and attach latency shall be measured and captured.

16 4.2.2 Test setup and configuration

17 The network setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with
18 multiple stationary UEs (real or emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE
19 RSRP (for LTE) or 5G SS-RSRP (for 5G NSA) as the metric. The test is suitable for lab as well as field environments.

20 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
21 recorded in the test report.

22 Laboratory setup: The radio conditions experienced by the UEs can be modified using variable attenuators/fading
23 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
24 a UE emulator. The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G
25 SS-RSRP (for 5G NSA) as defined in Section 3.6) for the UEs, but the minimum coupling loss (see Section 3.6) should
26 not be exceeded. The UEs should be placed inside an RF shielded box or RF shielded room if the UEs are not connected
27 via cable.


1 Field setup: The multiple UEs are placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where
2 excellent radio conditions (LTE RSRP and 5G SS-RSRP as defined in Section 3.6) should be observed. The minimum
3 coupling loss (see 3.6) should not be exceeded.

4 Please refer to Figure 5-1 for the E2E test setups for LTE and 5G NSA.

5 4.2.3 Test Procedure


6 The test steps below are applicable for either LTE or 5G NSA:

7 1. The test setup is configured according to the test configuration. The test configuration should be recorded in
8 the test report. The serving cell under test is activated and unloaded. All other cells are powered off.

9 2. The multiple UEs (real or emulated) are placed under excellent radio conditions (Cell centre close to radiated
10 eNB/gNB Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA) in Section 3.6.

11 3. The End-to-end setup shall be operational for LTE or 5G NSA as applicable for the test scenario, and there
12 should not be any connectivity issues.

13 4. Start the logs to capture the call flow and signalling messages

14 5. “Power ON” the multiple connected UEs to attach to the LTE or 5G NSA cell. Wait for the successful attach
15 of all UEs.

16 6. “Power OFF” the multiple UEs to detach from the network. Wait for the successful detach of all UEs

17 7. Stop and save the test logs. The logs should be captured and kept for test result reference and measurements

18 8. Repeat steps 4 to 7, for a total of 10 times and record the KPIs mentioned in Section 4.2.4

19 4.2.4 Test requirements (expected results)

20 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
21 should be captured and reported in the test report for performance assessment

22 • Radio parameters such as RSRP, RSRQ

23 • KPIs mentioned in Table 4-4 and Table 4-5

24 Validate the successful procedures for each UE from the collected logs. Expected success rate for Attach/Detach and
25 Secondary Node Addition/Release KPI is 100%. The attach-detach procedure should pass 10 consecutive times for all
26 UEs to mark the test case as passing. The gap analysis should be provided for the measured and the expected target KPIs.

27 • LTE Attach-Detach test case validations and KPI measurements

• Validate successful attach of each UE 10 times with the LTE cell (refer to 3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach). In the UE logs or applications installed on the UEs, check that each UE is attached to the correct cell (e.g. PCI, Global eNB ID, ARFCN as per the test configuration).

31 • Measure the attach success rate by validating attach request and attach complete for each iteration.
32 Record the attach success rate in Table 4-4 for each UE.

• Measure the attach latency by calculating the time between attach request and attach complete. Capture the latency for each iteration and sort the observed values in ascending order. Record the minimum, average (sum of all latency values divided by the total number of iterations, 10 in this case) and maximum latency in Table 4-5 for each UE.


• Validate successful detach with the LTE cell (refer to 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN). Signalling connection release should be validated from the message flow (UE context release and RRC connection release messages). Measure the detach success rate by validating detach request and detach accept for each iteration. Record the detach success rate in Table 4-4.

6 • 5G NSA Attach-Detach test cases validations and KPI measurements

• Validate successful attach of the multiple UEs with the 5G NSA cell (3GPP TS 23.401 [29], Section 5.3.2.1 E-UTRAN Initial Attach and 3GPP TS 37.340 [30], Section 10.2.1 EN-DC Secondary Node Addition). In the UE logs or applications installed on the UEs, check that each UE is attached to the correct cell (e.g. PCI, Global eNB ID/Global gNB ID, ARFCN/NR-ARFCN as per the test configuration for the LTE/5G cells).

• Measure the attach success rate by validating attach request and attach complete for each iteration. Also measure the secondary node addition success rate by validating the SgNB addition request and SgNB reconfiguration complete as per the call flow in 3GPP TS 37.340 [30], Section 10.2.1 EN-DC for each iteration. Record the attach success rate and secondary node addition success rate in Table 4-4 for each UE.

• Validate successful detach (LTE detach and 5G Secondary Node Release) with the 5G NSA cell (refer to 3GPP TS 23.401 [29], Section 5.3.8.2.1 UE-initiated Detach procedure for E-UTRAN and 3GPP TS 37.340 [30], Section 10.4.1 EN-DC Secondary Node Release). Signalling connection release should be validated from the message flow (UE context release and RRC connection release messages). The 5G secondary node should also be released successfully. Measure the detach success rate by validating detach request and detach accept for each iteration. Record the detach success rate in Table 4-4 for each UE.

Table 4-4 KPI to be captured for multi-UE attach-detach test case

LTE KPI:     Attach Success Rate | Detach Success Rate
5G NSA KPI:  Attach Success Rate | Detach Success Rate | SgNB Addition Success Rate

Table 4-5 Latency KPI for multi-UE attach

LTE Attach Time (millisecond):  Minimum | Maximum | Average

25 4.3 5G SA registration and deregistration of single UE

26 4.3.1 Test description and applicability

27 The purpose of the test is to verify the full registration and de-registration procedure with a single UE. The test also
28 verifies the PDU session establishment and release procedures.

29 The test focuses on the procedure of ‘Initial registration’ as defined in 3GPP TS 23.502 [28] Section 4.2.2.2.2.

30 The test focuses on the procedure of ‘UE-initiated de-registration’ as defined in 3GPP TS 23.502 [28] Section 4.2.2.3.2.

31 This test also validates PDU session establishment and release procedures.


1 The test validates the 3GPP standard registration/de-registration procedure and the latency of the procedure. Bi-
2 directional data transmission shall be observed before the de-registration procedure to verify the stability of the network
3 slice.

4 4.3.2 Test setup and configuration

5 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with a
6 stationary UE (real or emulated) placed under excellent radio conditions as defined in Section 3.6, using SS-RSRP as
7 the metric. Within the cell there should be only one active UE. The application server should be placed as close as
8 possible to the core/core emulator and connected to the core/core emulator via a transport link with enough capacity so
as not to limit the expected data throughput. The UE, RAN, and 5G Core shall support network slicing, at least for one Single Network Slice Selection Assistance Information (S-NSSAI). The test is suitable for lab as well as field environments.

12 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
13 recorded in the test report.

14 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
15 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
16 a UE emulator. The test environment should be setup to achieve excellent radio conditions (using SS-RSRP as defined in
17 Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be
18 placed inside an RF shielded box or RF shielded room if the UE is not connected via cable.

19 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
20 conditions (SS-RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see 3.6) should not be
21 exceeded.

Please refer to Figure 5-1 for the E2E test setup for 5G SA.

23 4.3.3 Test Procedure

24 Below are the test procedure steps

25 1. The test setup is configured according to the test configuration. The test configuration should be recorded in
26 the test report. The serving cell under test is activated and unloaded. All other cells are powered off.

2. Power on the UE; the UE shall send a REGISTRATION REQUEST message and shall successfully register to the 5G SA network.

3. Full-buffer UDP bi-directional data transmission (see Section 3.4) between the application server and the UE is initiated (a traffic-generation sketch follows this list).

31 4. The registration procedure messages shall be captured, and the latency of the registration procedure shall be
32 measured and recorded in Table 4-6. The duration of the test should be at least 3 minutes when the throughput
33 is stable. The PDU session establishment procedure messages shall also be captured and verified.

5. Power off the UE; the UE shall send a DEREGISTRATION REQUEST message and shall successfully de-register from the 5G SA network.

36 6. The de-registration procedure messages shall be captured, and the latency of de-registration procedure shall be
37 measured and recorded in Table 4-6. The PDU session release procedure messages shall also be captured and
38 verified.

39 7. Repeat steps 2 to 6, for a total of 10 times and record the KPIs mentioned in Table 4-6.
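One possible way to generate the full-buffer UDP bi-directional load from a Linux test host is sketched below. It assumes iperf3 (version 3.7 or later, for --bidir) is installed and an iperf3 server is running on the application server; the address is a placeholder.

```python
import json
import subprocess

APP_SERVER = "198.51.100.10"  # placeholder address of the application server

# Full-buffer UDP in both directions for at least 3 minutes (180 s):
# -u selects UDP, -b 0 removes the bitrate cap, --bidir runs DL and UL together.
cmd = ["iperf3", "-c", APP_SERVER, "-u", "-b", "0", "-t", "180", "--bidir", "--json"]
run = subprocess.run(cmd, capture_output=True, text=True, check=True)
summary = json.loads(run.stdout)["end"]  # per-direction throughput, jitter, loss
print(json.dumps(summary, indent=2))
```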


1 4.3.4 Test requirements (expected results)

2 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
3 should be captured and reported in the test report for the performance assessment.

4 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

5 • Latency KPI mentioned in Table 4-6

Validate from the collected logs the registration (as per 3GPP TS 23.502 [28], Section 4.2.2.2.2) and deregistration (as per 3GPP TS 23.502 [28], Section 4.2.2.3.2) procedures, and also validate 'UE Requested PDU Session Establishment for Non-roaming and Roaming with Local Breakout case' as defined in 3GPP TS 23.502 [28], Section 4.3.2.2.1, and the procedure of 'PDU Session Release for UE or network requested PDU Session Release for Non-Roaming and Roaming with Local Breakout case' as defined in 3GPP TS 23.502 [28], Section 4.3.4.2. The procedure should pass 10 consecutive times to mark the test case as passing. The gap analysis should be provided for the measured and the expected target KPIs.

The Registration Time latency is measured as the time between Registration Request and Registration Complete; the De-registration Time latency is measured as the time between De-registration Request and Signalling Connection Release. Capture the latency for each iteration and sort the observed values in ascending order. Record the minimum, average (sum of all latency values divided by the total number of iterations, 10 in this case) and maximum latency in Table 4-6.

Table 4-6 5G SA registration/de-registration latency KPI record table of single UE

KPI                                                  Repeat Times                     Calculation
                                                     1  2  3  4  5  6  7  8  9  10    Minimum   Maximum   Average
Registration Time (single slice) (millisecond)
De-registration Time (single slice) (millisecond)

19 4.4 Intra-O-DU mobility

20 4.4.1 Test description and applicability

21 The purpose of the test is to verify intra O-CU, intra O-DU handover of a UE. The test validates the O-CU, O-DU, and
22 O-RU functionality in handling inter-cell mobility when two O-RUs are connected to an O-DU. The test measures the
23 DL / UL throughput variations, handover latency, handover interruption and packet loss during the mobility. Test
24 scenarios are classified into two groups as Standalone (SA) and Non-Standalone (NSA).

Intra O-DU mobility with SA shall follow 3GPP TS 38.401 [31], Section 8.2.1 and 3GPP TS 38.473 [32], Section 8.3.4 for the call flow. Intra O-DU mobility with NSA shall follow 3GPP TS 38.401 [31], Section 8.2.1, 3GPP TS 38.473 [32], Section 8.3.4 and 3GPP TS 37.340 [30] for the call flow.


1 4.4.2 Test setup and configuration

2 In NSA mode, the test setup consists of one 4G cell (MeNB) and two 5G cells (SgNB). Each 5G cell is associated with
3 a single O-DU, connected to a single O-CU, refer to Figure 4-1 for the test setup topology. The test environment shall
4 have a single UE with active data traffic. The application server should be placed as close as possible to the core and
5 connected to the core via a transport link with enough capacity.

7 Figure 4-1 Intra O-DU mobility test bed for NSA mode of operation.

8 In SA, the test setup consists of two 5G cells, each one associated with the same O-DU and O-CU connected to a 5G
9 core network (see Figure 4-2 for the test setup topology). The test environment shall have a single UE with active data
10 traffic. The application server should be placed as close as possible to the core and connected to the core via a transport
11 link with enough capacity.

12

13 Figure 4-2 Intra O-DU mobility test bed for SA mode of operation.

14 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
15 recorded in the test report.

16 Laboratory setup: The radio conditions of UE can be modified by a variable attenuator/fading generator inserted
17 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.


The radio conditions of the UE are initially set to excellent using RSRP as the metric. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE handover between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (using RSRP as the metric as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

10 4.4.3 Test Procedure


11 Below are the NSA mode steps

12 1. The 4G and 5G cell setups are configured following Section 3.2.

13 2. All the three cells are configured according to the test configuration. The cells are activated and unloaded.

14 3. Both 5G cells are configured as neighbors to each other, so that the UE can trigger measurement events for
15 handover.

4. The source (cell 1) and target (cell 2) 5G cells for intra O-DU mobility are as depicted in Figure 4-1.

17 5. The test UE is under source 5G cell coverage.

18 6. Power on the UE and UE shall successfully complete the LTE attach followed by successful SgNB addition to
19 source 5G cell.

20 7. The full-buffer UDP bi-directional data transmission (see 3.4) from the application server is initiated.

21 8. The UE shall move from the source 5G cell to the target 5G cell to trigger a handover.

22 Below are the SA Mode steps

23 1. The 5G cell setup is configured following Section 3.2.

24 2. Configure two 5G cells within an O-DU according to the test configuration. The cells are activated and
25 unloaded.

26 3. Both 5G cells are configured as neighbors to each other, so that the UE can trigger measurement events for
27 handover.

4. The source and target 5G cells for intra O-DU mobility are as depicted in Figure 4-2.

29 5. The test UE is under source cell coverage.

30 6. Power on the UE and UE shall successfully register to source 5G cell.

31 7. The full-buffer UDP bi-directional data transmission (see 3.4) from the application server is initiated.

8. The UE shall move from the source cell to the target cell to perform the handover (one way to emulate this movement in the lab is sketched below).
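In the laboratory, the movement in step 8 is typically emulated with the variable attenuators described in Section 4.4.2. A minimal sketch, assuming hypothetical programmable-attenuator handles that expose a set_db() method (the step size, dwell time and attenuation range are illustrative):

```python
import time

def ramp_handover(source_atten, target_atten, step_db: int = 2,
                  dwell_s: float = 1.0, start_db: int = 0, stop_db: int = 40):
    """Cross-fade the source and target cell attenuations so that the UE's
    measurement events (e.g. A3) trigger a handover to the target cell.
    `source_atten`/`target_atten` are hypothetical attenuator handles."""
    target_atten.set_db(stop_db)   # target cell initially weak
    source_atten.set_db(start_db)  # source cell initially strong
    for db in range(start_db, stop_db + 1, step_db):
        source_atten.set_db(db)            # progressively weaken the source
        target_atten.set_db(stop_db - db)  # progressively strengthen the target
        time.sleep(dwell_s)                # let measurement reports settle
```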

33 4.4.4 Test requirements (expected results)

The intra O-DU handover call flow shall be verified for both NSA and SA use cases. The following functionalities shall also be validated:


1 • PDU Session is established when full-buffer bi-directional data transmission is initiated. (Only in SA Mode)
2 • Handover is successful.

3 In addition to the common minimum set of configuration parameters defined (see Section 3.3), the following metrics
4 and counters shall be recorded and reported for the performance assessment.

5 eNB/gNB/Application server side:

6 • Transmit downlink throughput measured at application server in time (average per second)
7 • Received uplink throughput measured at application server in time (average per second)
8 • Uplink packet loss percentage during handover.
9 • Uplink BLER, MCS, MIMO rank (RI) on PUSCH in time (average per second)

10 UE side:

11 • Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)
12 • Downlink BLER, MCS, MIMO rank (RI) on PDSCH in time (average per second).
13 • Received Downlink throughput (L1 and L3 PDCP layers) in time (average per second).
14 • Downlink packet loss percentage during handover
15 • Uplink throughput (L1 and L3 PDCP layers) in time (average per second)
• Channel utilization, i.e. the number of allocated/occupied downlink and uplink RBs in time (per TTI/average per second) and the number of allocated/occupied slots in time.
• KPIs related to handover failure, call drop, handover latency, and handover interruption time (one way to derive interruption time and packet loss from a packet trace is sketched below).
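A minimal sketch of how handover interruption time and packet loss could be derived from a per-packet trace, assuming the test tool exports (timestamp, sequence number) pairs for the received stream; the trace format is an assumption, not something this specification defines.

```python
def handover_interruption_s(timestamps):
    """Longest gap between consecutively received packets: with a constant-rate
    full-buffer stream this approximates the user-plane interruption time."""
    gaps = (b - a for a, b in zip(timestamps, timestamps[1:]))
    return max(gaps, default=0.0)

def packet_loss_pct(seq_numbers):
    """Loss percentage from sequence numbers over the handover window."""
    expected = seq_numbers[-1] - seq_numbers[0] + 1
    return 100.0 * (expected - len(set(seq_numbers))) / expected

# Hypothetical received-packet trace around the handover (time in seconds).
trace = [(0.000, 1), (0.001, 2), (0.002, 3), (0.048, 6), (0.049, 7)]
times = [t for t, _ in trace]
seqs = [s for _, s in trace]
print(f"interruption ~ {handover_interruption_s(times) * 1000:.0f} ms")  # ~46 ms
print(f"loss = {packet_loss_pct(seqs):.1f} %")  # 2 of 7 expected lost -> 28.6 %
```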

19 4.5 Inter-O-DU mobility

20 4.5.1 Test description and applicability

21 The purpose of the test is to verify intra O-CU, inter O-DU handover of a UE. The test validates the O-CU, O-DU
22 functionality in handling Inter O-DU handover. The test measures the DL / UL throughput variations, handover latency,
23 handover interruption and packet loss during the handover procedure. Test scenarios are classified into two groups as
24 SA (Standalone) and NSA (Non-Standalone).

25 Inter O-DU mobility with SA shall follow 3GPP TS 38.401 [31], Section 8.2.1 and Inter O-DU mobility with NSA shall
26 follow 3GPP TS 38.401[31], Section 8.2.2 for the call flow.

CR 0104 to 3GPP TS 38.401, incorporated in v15.7.0, modified the initial part of the call flow in Section 8.2.2. An O-RAN system supporting a 3GPP specification version later than v15.7.0 shall follow Section 8.2.2 of 3GPP TS 38.401 v15.7.0 or later to verify the inter O-DU handover.

30 4.5.2 Test setup and configuration

31 In Non-Standalone Mode, the test setup consists of one 4G Cell (MeNB) and two 5G cells (SgNB). Each 5G Cell is
32 associated with different O-DUs, connected to the same O-CU. The test environment shall have a single UE with active
33 data traffic. The application server should be placed as close as possible to the core and connected to the core via a
34 transport link with enough capacity.


2 Figure 4-3 Inter O-DU mobility test bed for NSA mode of operation.

3 In standalone Mode, the test setup consists of two 5G cells, each one associated with a different O-DU, connected to the
4 same O-CU. The test environment shall have a single UE with active data traffic. The application server should be
5 placed as close as possible to the core and connected to the core via a transport link with enough capacity.

7 Figure 4-4 Inter O-DU mobility test bed for SA mode of operation

9 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
10 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator. The radio conditions of the UE are initially set to excellent. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE handover between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.


Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

5 4.5.3 Test Procedure

6 In Non-Standalone Mode

7 1. The 4G and 5G cell setups are configured following Section 3.2.


8 2. All the three cells are configured according to the test configuration. The cells are activated and unloaded.
3. Both 5G cells are configured as neighbors to each other, so that the UE can trigger measurement events for handover.
10 4. The test UE is under source O-DU cell coverage.
11 5. Power on the UE and the UE shall successfully complete the LTE attach followed by successful SgNB
12 addition to source O-DU.
13 6. The full-buffer UDP data transmission (see 3.4) from the application server is initiated.
14 7. The UE shall move from source O-DU to target O-DU to perform handover.

15 In Standalone Mode

16 1. The 5G cell setup is configured following Section 3.2.


17 2. All the 5G cells are configured according to the test configuration. The cells are activated and unloaded.
3. Both 5G cells are configured as neighbors to each other, so that the UE can trigger measurement events for handover.
19 4. The test UE is under source O-DU cell coverage.
20 5. Power on the UE and the UE shall successfully register to source O-DU cell.
21 6. The full-buffer UDP data transmission (see 3.4) from the application server is initiated.
22 7. The UE shall move from source O-DU to target O-DU to perform handover.

23 4.5.4 Test requirements (expected results)

The inter O-DU handover call flow shall be verified for both NSA and SA use cases. The following functionalities shall also be validated:

26 • PDU Session is established when full-buffer bi-directional data transmission is initiated. (Only in SA Mode)
27 • Handover is successful.

28 In addition to the common minimum set of configuration parameters defined (see Section 3.3), the following metrics
29 and counters shall be recorded and reported for the performance assessment.

30 eNB/gNB/Application server side:

31 • Transmit downlink throughput measured at application server in time (average per second)
32 • Received uplink throughput measured at application server in time (average per second)
33 • Uplink packet loss percentage during handover.
34 • Uplink BLER, MCS, MIMO rank (RI) on PUSCH in time (average per second)

35 UE side:

36 • Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)
37 • Downlink BLER, MCS, MIMO rank (RI) on PDSCH in time (average per second).


1 • Received Downlink throughput (L1 and L3 PDCP layers) in time (average per second).
2 • Downlink packet loss percentage during handover
3 • Uplink throughput (L1 and L3 PDCP layers) in time (average per second)
• Channel utilization, i.e. the number of allocated/occupied downlink and uplink RBs in time (per TTI/average per second) and the number of allocated/occupied slots in time.
• KPIs related to handover failure, call drop, handover latency, and handover interruption time (see the trace-based sketch in Section 4.4.4).

7 4.6 Inter-O-CU mobility

8 4.6.1 Test description and applicability

The purpose of the test is to verify inter O-CU handover of the UE. The test validates the O-CU and O-DU functionality in handling inter O-CU mobility when the O-CUs are connected to the same 5G Core Network (in SA) or Master eNB (in NSA). The test measures the DL/UL throughput variations, handover latency, handover interruption and packet loss during the mobility. Test scenarios are classified into two groups as Standalone (SA) and Non-Standalone (NSA).

Inter O-CU mobility with SA (Xn-based handover) shall follow 3GPP TS 38.401 [31], Sections 8.9.4 and 8.9.5 for the call flow. Inter O-CU mobility with NSA shall follow 3GPP TS 37.340 [30], Section 10.5.1 for the call flow.

15 4.6.2 Test setup and configuration

16 In non-standalone mode, the test setup consists of a 4G cell (MeNB) and two 5G cells (SgNB), each 5G cell is
17 associated with a different O-DU and O-CU connected to the same 4G core network, refer to Figure 4-5 for the test
18 setup topology. The test environment shall have single UE with active data traffic. The application server should be
19 placed as close as possible to the core and connected to the core via a transport link with enough capacity.

21 Figure 4-5 Inter O-CU mobility test bed for NSA mode of operation

In standalone mode, the test setup consists of two 5G cells, each one associated with a different O-DU and O-CU connected to the same 5G Core network (see Figure 4-6 for the test setup topology). The test environment shall have a single UE with active data traffic. The application server should be placed as close as possible to the core and connected to the core via a transport link with enough capacity.


2 Figure 4-6 Inter O-CU mobility test bed for SA mode of operation.

4 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
5 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator. The radio conditions of the UE are initially set to excellent. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE handover between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

18 4.6.3 Test Procedure

19 In Non-Standalone Mode

20 1. The 4G and 5G cell setups are configured following Section 3.2.


21 2. Configure two 5G cells connected to different O-DU and O-CU according to the test configuration. The cells
22 are activated and unloaded.
23 3. 5G cells are configured as neighbors to 4G cell, so that UE can trigger measurement events for mobility.
24 4. The source cell (source O-DU and source O-CU) is the cell where UE is initially placed as depicted in Figure
25 4-5.
26 5. Power on the UE and the UE shall successfully complete the LTE attach followed by successful SgNB
27 addition to source 5G cell.


1 6. The full-buffer UDP bi-directional data transmission (see 3.4) from the application server is initiated.
2 7. The UE shall move from source 5G cell to target 5G cell.

3 In Standalone Mode

4 1. The 5G cell setup is configured following Section 3.2.


5 2. Configure two 5G cells connected to different O-DU and O-CU according to the test configuration. The cells
6 are activated and unloaded.
3. Both 5G cells are configured as neighbors to each other, so that the UE can trigger measurement events for handover.
8 4. The source cell (source O-DU and source O-CU) is the cell where UE is initially placed as depicted in Figure
9 4-6.
10 5. Power on the UE and UE shall successfully register to source 5G cell.
11 6. The full-buffer UDP bi-directional data transmission (see 3.4) from the application server is initiated.
12 7. The UE shall move from source cell to target cell to perform handover.

13 4.6.4 Test requirements (expected results)

The inter O-CU handover call flow shall be verified for both NSA and SA use cases. The following functionalities shall also be validated:

15 • PDU Session is established when full-buffer bi-directional data transmission is initiated. (Only in SA Mode)
16 • Handover is successful.

17 In addition to the common minimum set of configuration parameters defined (see Section 3.3), the following metrics
18 and counters should be recorded and reported for the performance assessment

19 UE side:

20 • Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)
21 • Downlink BLER, MCS, MIMO rank (RI) on PDSCH in time (average per second).
22 • Received Downlink throughput (L1 and L3 PDCP layers) in time (average per second).
23 • Downlink packet loss percentage during handover
24 • Uplink throughput (L1 and L3 PDCP layers) in time (average per second)
• Channel utilization, i.e. the number of allocated/occupied downlink and uplink RBs in time (per TTI/average per second) and the number of allocated/occupied slots in time.
• KPIs related to handover failure, call drop, handover latency, and handover interruption time (see the trace-based sketch in Section 4.4.4).

28 eNB/gNB/Application server side:

29 • Transmit downlink throughput measured at application server in time (average per second)
30 • Received uplink throughput measured at application server in time (average per second)
31 • Uplink packet loss percentage during handover.
32 • Uplink BLER, MCS, MIMO rank (RI) on PUSCH in time (average per second)

1 4.7 Registration and deregistration to a single network slice

2 4.7.1 Test description and applicability

3 The purpose of the test is to verify the full procedure of the registration and de-registration to the single eMBB network
4 slice in the 5G SA network.

The test focuses on the procedure of 'Initial registration' as defined in 3GPP TS 23.502 [28], Section 4.2.2.2.2, with a single eMBB network slice. The network slice information (i.e., Network Slice Selection Assistance Information, NSSAI) defined in Table 4-7 shall be verified within the 3GPP standard registration procedure.

The test focuses on the procedure of 'UE-initiated de-registration' as defined in 3GPP TS 23.502 [28], Section 4.2.2.3.2, with a single eMBB network slice. Even though the network slice information is not included in the deregistration procedure, the mandatory IEs defined in Table 4-8 shall be verified within the 3GPP standard deregistration procedure.

11 The granularity of slice awareness is in PDU session level, so this test also verifies the PDU session establishment and
12 release procedures.

The test measures not only the 3GPP standard registration/de-registration procedure but also the latency of the procedure. Bi-directional data transmission shall be performed before the de-registration procedure to verify the stability of the network slice.

16 4.7.2 Test setup and configuration

The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.8) with a stationary UE (real or emulated) placed under the excellent radio conditions defined in Section 3.7; RSRQ should be used as the metric for the downlink. Within the cell there should be only one active UE. The application server should be placed as close as possible to the core/core emulator and connected to the core/core emulator via a transport link with enough capacity so as not to limit the expected data throughput. The UE, RAN, and 5G Core shall support network slicing, at least for one eMBB Single Network Slice Selection Assistance Information (S-NSSAI). The test is suitable for lab as well as field environments.

24 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
25 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors of the O-RU and the UE. The attenuation of the radio signal should be set to achieve excellent radio conditions (RSRQ as defined in Section 3.7), but the minimum coupling loss (see Section 3.7) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable.

Field setup: The UE is placed in the centre of the cell close to the radiated gNB antenna(s), where excellent radio conditions (RSRQ as defined in Section 3.7) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded.

Please refer to Figure 5.2-1 for the E2E test setup for 5G SA.

33 4.7.3 Test Procedure

34 Below are the test procedure steps

1. The NR cells shall be configured with the initial conditions defined in Section 3.2.

36 2. Only one eMBB S-NSSAI (i.e., Slice/Service Type (SST) = 0x01) under the NSSAI shall be configured in UE,
37 RAN, and 5G Core.


3. Power on the UE; the UE shall send a REGISTRATION REQUEST message. Only one requested S-NSSAI, with the value configured in step 2, shall be included in the message. The UE shall successfully register to the 5G SA network.

4. The full-buffer UDP bi-directional data transmission (see Section 3.3) between the UE and the application server is initiated.

6 5. The registration procedure messages shall be captured to verify the mandatory IEs defined in Table 4-7, and
7 the latency of registration procedure shall be measured and recorded in Table 4-9. The duration of the test
8 should be at least 3 minutes when the throughput is stable. The PDU session establishment procedure messages
9 shall also be captured and verified.

6. Power off the UE; the UE shall send a DEREGISTRATION REQUEST message and shall successfully de-register from the 5G SA network.

12 7. The de-registration procedure messages shall be captured to verify the mandatory IEs defined in Table 4-8, and
13 the latency of de-registration procedure shall be measured and recorded in Table 4-9. The PDU session release
14 procedure messages shall also be captured and verified.

8. Repeat steps 3 to 7 for a total of 10 times and record the KPIs mentioned in Table 4-9.

16 4.7.4 Test requirements (expected results)

17 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
18 should be captured and reported in the test report for the performance assessment.

19 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

20 • Mandatory IEs mentioned in Table 4-7, and Table 4-8

21 • Latency KPI mentioned in Table 4-9

Validate from the collected logs the registration (as per 3GPP TS 23.502 [28], Section 4.2.2.2.2) and deregistration (as per 3GPP TS 23.502 [28], Section 4.2.2.3.2) procedures, and also validate 'UE Requested PDU Session Establishment for Non-roaming and Roaming with Local Breakout case' as defined in 3GPP TS 23.502 [28], Section 4.3.2.2.1, and the procedure of 'PDU Session Release for UE or network requested PDU Session Release for Non-Roaming and Roaming with Local Breakout case' as defined in 3GPP TS 23.502 [28], Section 4.3.4.2.

27 Table 4-7 defines the verification steps along with validation of mandatory IEs for 5G SA registration procedure in
28 single eMBB network slice.

Table 4-7 5G SA Registration verification with mandatory IEs in single eMBB network slice

St.  Procedure                      Msg Flow    Expected Output
1    RRCSetupComplete               UE → gNB    Verify that the UE sends only one S-NSSAI in IE s-nssai-List, with SST = '0x01' in the S-NSSAI, to the gNB.
2    Registration Request           UE → AMF    Verify that the UE sends only one S-NSSAI in IE Requested NSSAI to the 5G AMF, with IE 5GS registration type = 'initial registration' and SST = '0x01' in the S-NSSAI.
3    Initial Context Setup Request  AMF → gNB   Verify that the 5G AMF sends only one S-NSSAI in IE Allowed NSSAI and in S-NSSAI (under parent IE PDU Session Resource Setup Request Item) to the gNB, with SST = '0x01' in the S-NSSAI.
4    Registration Accept            AMF → UE    Verify that the 5G AMF sends only one S-NSSAI in IE Allowed NSSAI and Configured NSSAI to the UE, with SST = '0x01' in the S-NSSAI; IE Rejected NSSAI shall not exist.
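A minimal sketch of the Table 4-7 checks, assuming the captured messages have already been decoded into Python dictionaries (the decoding itself, e.g. from a capture via a protocol analyzer, is outside this sketch; the field names are assumptions):

```python
EXPECTED_SST = 0x01  # single eMBB slice

def check_single_snssai(nssai_list, where):
    """Table 4-7 condition: exactly one S-NSSAI, and its SST is eMBB (0x01)."""
    assert len(nssai_list) == 1, f"{where}: expected exactly one S-NSSAI"
    assert nssai_list[0]["sst"] == EXPECTED_SST, f"{where}: SST != 0x01"

# Hypothetical decoded messages from the capture:
registration_request = {"5gs_registration_type": "initial registration",
                        "requested_nssai": [{"sst": 0x01}]}
registration_accept = {"allowed_nssai": [{"sst": 0x01}],
                       "configured_nssai": [{"sst": 0x01}]}

assert registration_request["5gs_registration_type"] == "initial registration"
check_single_snssai(registration_request["requested_nssai"], "Registration Request")
check_single_snssai(registration_accept["allowed_nssai"], "Registration Accept")
check_single_snssai(registration_accept["configured_nssai"], "Registration Accept")
assert "rejected_nssai" not in registration_accept  # Rejected NSSAI shall not exist
```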

2 Table 4-8 defines the verification steps along with validation of mandatory IEs for 5G SA deregistration procedure in
3 single eMBB network slice.

Table 4-8 5G SA Deregistration verification with mandatory IEs in single eMBB network slice

St.  Procedure                 Msg Flow   Expected Output
1    De-registration Request   UE → AMF   Verify that the UE sends IE De-registration type = 'Switch Off' to the 5G AMF.
2    De-registration Accept    AMF → UE   Verify that the 5G AMF sends IE De-registration accept message identity = 'Deregistration accept (UE originating)' to the UE.

Table 4-9 defines the KPI record table of the registration/de-registration testing in a single eMBB network slice.

Table 4-9 Single eMBB Network Slice KPI record table

KPI                                                  Repeat Times                     Calculation
                                                     1  2  3  4  5  6  7  8  9  10    Minimum   Maximum   Average
Registration Time (single slice) (millisecond)
De-registration Time (single slice) (millisecond)

10 4.8 Registration and deregistration to multiple network slices

11 4.8.1 Test description and applicability

The purpose of the test is to verify the full procedure of the registration and de-registration to multiple network slices (i.e., eMBB, URLLC, MIoT, V2X) in the 5G SA network.


The test focuses on the procedure of 'Initial registration' as defined in 3GPP TS 23.502 [28], Section 4.2.2.2.2, with multiple network slices. The network slice information (i.e., Network Slice Selection Assistance Information, NSSAI) defined in Table 4-10 shall be verified within the 3GPP standard registration procedure.

The test focuses on the procedure of 'UE-initiated de-registration' as defined in 3GPP TS 23.502 [28], Section 4.2.2.3.2, with multiple network slices. Even though the network slice information is not included in the deregistration procedure, the mandatory IEs defined in Table 4-11 shall be verified within the 3GPP standard deregistration procedure.

7 The granularity of slice awareness is in PDU session level, so this test also verifies the PDU session establishment and
8 release procedures.

The test measures not only the 3GPP standard registration/de-registration procedure but also the latency of the procedure. Bi-directional data transmission shall be performed before the de-registration procedure to verify the stability of the network slices.

12 4.8.2 Test setup and configuration

The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.8) with a stationary UE (real or emulated) placed under the excellent radio conditions defined in Section 3.7; RSRQ should be used as the metric for the downlink. Within the cell there should be only one active UE. The application server(s) should be placed as close as possible to the core/core emulator and connected to the core/core emulator via a transport link with enough capacity so as not to limit the expected data throughput. The UE, RAN, and 5G Core shall support multiple Network Slice Selection Assistance Information (S-NSSAI). At least two of the eMBB, URLLC, MIoT, and V2X network slices shall be covered in this test. The test is suitable for lab as well as field environments.

21 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
22 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors of the O-RU and the UE. The attenuation of the radio signal should be set to achieve excellent radio conditions (RSRQ as defined in Section 3.7), but the minimum coupling loss (see Section 3.7) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable.

Field setup: The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (RSRQ as defined in Section 3.7) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded.

Please refer to Figure 5.2-1 for the E2E test setup for 5G SA.

31 4.8.3 Test Procedure


32 Below are the test procedure steps

1. The NR cells shall be configured with the initial conditions defined in Section 3.2.

34 2. Multiple S-NSSAI (i.e., Slice/Service Type (SST) = 0x01/0x02/0x03/0x04) under the NSSAI shall be
35 configured in UE, RAN, and 5G Core.

3. Power on the UE; the UE shall send a REGISTRATION REQUEST message. The multiple requested S-NSSAIs, with the values configured in step 2, shall be included in the message. The UE shall successfully register to the 5G SA network.

4. The full-buffer UDP bi-directional data transmission (see Section 3.3) between the UE and the application server is initiated in the eMBB network slice, if the eMBB network slice is covered.

5. The full-buffer UDP bi-directional data transmission (see Section 3.3) between the UE and the application server is initiated in the URLLC network slice, if the URLLC network slice is covered.

6. The full-buffer UDP bi-directional data transmission (see Section 3.3) between the UE and the application server is initiated in the MIoT network slice, if the MIoT network slice is covered.

7. The full-buffer UDP bi-directional data transmission (see Section 3.3) between the UE and the application server is initiated in the V2X network slice, if the V2X network slice is covered.

8. The registration procedure messages shall be captured to verify the mandatory IEs defined in Table 4-10, and the latency of the registration procedure shall be measured and recorded in Table 4-12. The duration of the test should be at least 3 minutes when the throughput is stable. The PDU session establishment procedure messages shall also be captured and verified for the eMBB/URLLC/MIoT/V2X network slices.

9. Power off the UE; the UE shall send a DEREGISTRATION REQUEST message and shall successfully de-register from the 5G SA network.

10. The de-registration procedure messages shall be captured to verify the mandatory IEs defined in Table 4-11, and the latency of the de-registration procedure shall be measured and recorded in Table 4-12. The PDU session release procedure messages shall also be captured and verified for the eMBB/URLLC/MIoT/V2X network slices.

11. Repeat steps 3 to 10 for a total of 10 times and record the KPIs mentioned in Table 4-12.

18 4.8.4 Test requirements (expected results)

19 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
20 should be captured and reported in the test report for the performance assessment.

21 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

• Mandatory IEs mentioned in Table 4-10 and Table 4-11

23 • Latency KPI mentioned in Table 4-12

Validate from the collected logs the registration procedure (as per 3GPP TS 23.502 Section 4.2.2.2.2) and the de-registration procedure (as per 3GPP TS 23.502 Section 4.2.2.3.2). Also validate the procedure of ‘UE Requested PDU Session Establishment for Non-roaming and Roaming with Local Breakout case’ as defined in 3GPP TS 23.502 Section 4.3.2.2.1, and the procedure of ‘PDU Session Release for UE or network requested PDU Session Release for Non-Roaming and Roaming with Local Breakout case’ as defined in 3GPP TS 23.502 Section 4.3.4.2, for the multiple network slices of eMBB/URLLC/MIoT/V2X.

Table 4-10 defines the verification steps, along with validation of the mandatory IEs, for the 5G SA registration procedure in multiple network slices.

Table 4-10 5G SA Registration verification with mandatory IEs in multiple network slices

St. | Procedure | Msg Flow | Expected Output
1 | RRCSetupComplete | UE → gNB | Verify that the UE sends multiple S-NSSAIs in IE s-nssai-List, with SST = ‘0x01/0x02/0x03/0x04’ in the S-NSSAI, to the gNB.
2 | Registration Request | UE → AMF | Verify that the UE sends multiple S-NSSAIs in IE Requested NSSAI to the 5G AMF, with IE 5GS registration type = ‘initial registration’ and SST = ‘0x01/0x02/0x03/0x04’ in the S-NSSAI.
3 | Initial Context Setup Request | AMF → gNB | Verify that the 5G AMF sends multiple S-NSSAIs in IE Allowed NSSAI and S-NSSAI (under parent IE PDU Session Resource Setup Request Item) to the gNB, with SST = ‘0x01/0x02/0x03/0x04’ in the S-NSSAI.
4 | Registration Accept | AMF → UE | Verify that the 5G AMF sends multiple S-NSSAIs in IE Allowed NSSAI and Configured NSSAI to the UE, with SST = ‘0x01/0x02/0x03/0x04’ in the S-NSSAI; IE Rejected NSSAI shall not exist.
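Where message decoding is automated, the checks in Table 4-10 can be scripted. The following Python sketch is a minimal, hypothetical example: it assumes the captured NAS messages have already been decoded into dictionaries (the field names used here are illustrative placeholders, not taken from any particular tool), and it demonstrates only the IE checks, not the decoding itself.

```python
# Hypothetical post-processing of decoded 5G SA registration messages.
# Field names are illustrative placeholders, not a real decoder's schema.
EXPECTED_SSTS = {0x01, 0x02, 0x03, 0x04}

def check_registration(decoded: dict) -> list[str]:
    """Return a list of failed Table 4-10 checks; an empty list means pass."""
    failures = []

    req = decoded["RegistrationRequest"]  # UE -> AMF (St. 2)
    if {s["sst"] for s in req["requested_nssai"]} != EXPECTED_SSTS:
        failures.append("Requested NSSAI does not carry SST 0x01-0x04")
    if req["registration_type"] != "initial registration":
        failures.append("5GS registration type is not 'initial registration'")

    acc = decoded["RegistrationAccept"]  # AMF -> UE (St. 4)
    if {s["sst"] for s in acc["allowed_nssai"]} != EXPECTED_SSTS:
        failures.append("Allowed NSSAI does not carry SST 0x01-0x04")
    if acc.get("rejected_nssai"):
        failures.append("Rejected NSSAI is present in Registration Accept")

    return failures

# Placeholder decoded capture for one passing run.
msgs = {
    "RegistrationRequest": {
        "registration_type": "initial registration",
        "requested_nssai": [{"sst": s} for s in (0x01, 0x02, 0x03, 0x04)],
    },
    "RegistrationAccept": {
        "allowed_nssai": [{"sst": s} for s in (0x01, 0x02, 0x03, 0x04)],
        "rejected_nssai": [],
    },
}
print(check_registration(msgs) or "Table 4-10 checks passed")
```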

Table 4-11 defines the verification steps, along with validation of the mandatory IEs, for the 5G SA deregistration procedure in multiple network slices.

Table 4-11 5G SA Deregistration verification with mandatory IEs in multiple network slices

St. | Procedure | Msg Flow | Expected Output
1 | De-registration Request | UE → AMF | Verify that the UE sends IE De-registration type = ‘Switch Off’ to the 5G AMF.
2 | De-registration Accept | AMF → UE | Verify that the 5G AMF sends IE De-registration accept message identity = ‘De-registration accept (UE originating)’ to the UE.

Table 4-12 defines the KPI record table for the registration/de-registration testing in multiple network slices.

Table 4-12 Multiple Network Slices KPI record table

KPI | Repeat Times: 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Calculation: Minimum | Maximum | Average
Registration Time (multiple slices) [millisecond] |  |  |  |  |  |  |  |  |  |  |  |  |
De-registration Time (multiple slices) [millisecond] |  |  |  |  |  |  |  |  |  |  |  |  |
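The Calculation columns of Table 4-12 follow directly from the ten per-iteration measurements. A minimal Python sketch, assuming the registration and de-registration latencies have already been extracted from the message captures (the values below are placeholders):

```python
# Per-iteration latencies in milliseconds (placeholder values, one per repeat).
registration_ms = [512.0, 498.3, 530.1, 505.7, 520.9,
                   499.8, 515.2, 508.4, 511.0, 503.6]
deregistration_ms = [101.2, 98.7, 105.4, 99.9, 102.8,
                     100.3, 104.1, 97.5, 103.0, 101.9]

def calculation_columns(samples_ms: list[float]) -> dict:
    """Compute the Minimum/Maximum/Average columns for one KPI row."""
    return {
        "Minimum": min(samples_ms),
        "Maximum": max(samples_ms),
        # Average = sum of all latency values / total iterations (10 here).
        "Average": sum(samples_ms) / len(samples_ms),
    }

print("Registration Time:", calculation_columns(registration_ms))
print("De-registration Time:", calculation_columns(deregistration_ms))
```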


1 4.9 Idle Mode Intra-O-DU mobility

2 4.9.1 Test description and applicability

The purpose of the test is to verify idle mode intra-O-DU mobility of a UE. The test validates the O-CU, O-DU and O-RU functionality in handling inter-cell mobility when two O-RUs are connected to one O-DU. Test scenarios are classified into two groups: Standalone (SA) intra-frequency cell reselection and Standalone (SA) inter-frequency cell reselection.

Idle mode intra-O-DU mobility shall follow 3GPP 38.133, Section 4.2.2.3 and Section 4.2.2.4 for intra-frequency and inter-frequency cell reselection measurements, respectively, and 3GPP 38.304, Section 5.2.2 for the state transition.

9 4.9.2 Test setup and configuration


In SA, the test setup consists of two 5G cells, each associated with the same O-DU and O-CU connected to a 5G Core network; refer to Figure 4-7 for the test setup topology. The test environment shall have a single UE in idle mode.

13 Figure 4-7 Idle Mode Intra O-DU mobility test bed for SA mode of operation.

14 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
15 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors of the O-RU and the UE. The radio conditions of the UE are initially set to excellent. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE mobility between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

25 4.9.3 Test Procedure

Below are the SA mode intra-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. Configure two 5G cells (cell 1 and cell 2) of the same frequency within an O-DU according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source and target 5G cells for intra-O-DU mobility are depicted in Figure 4-7.

5. The test UE is under source cell coverage.

6. Power on the UE; the UE shall successfully register to the source 5G cell.

7. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

8. Once the UE moves to the new cell, make an MO data call.

9. Repeat the above test steps for 10 iterations.

Below are the SA mode inter-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. Configure two 5G cells (cell 1 and cell 2) on different frequencies within an O-DU according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source and target 5G cells for intra-O-DU mobility are depicted in Figure 4-7.

5. The test UE is under source cell coverage.

6. Power on the UE; the UE shall successfully register to the source 5G cell.

7. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

8. Once the UE moves to the new cell, make an MO data call.

9. Repeat the above test steps for 10 iterations.

24 4.9.4 Test requirements (expected results)

The idle mode intra-O-DU cell reselection shall be verified for both intra-frequency and inter-frequency. Also verify the call setup latency on the target cell with reference to the RRC connection call flow defined in 3GPP 38.331, Section 5.3.3.1. The following functionalities are also validated to declare the verdict.

• Cell re-selection is successful on the target cell.

• RRC connection establishment is successful on the target cell.

In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters shall be recorded and reported for the performance assessment.

32 gNB side:

• Idle mode mobility time, from the UE in idle mode at the source cell to the UE in idle mode at the target cell.


1 UE side:

2 • Radio parameters such as RSRP, RSRQ.


• KPIs related to cell re-selection failure, and idle mode mobility time from the UE in idle mode at the source cell to the UE in idle mode at the target cell.

Cell re-selection failure is determined by checking whether the cell re-selection succeeded in each iteration. The Idle-to-Connected latency on the target cell is measured as the time between the RRC_IDLE and RRC_CONNECTED states when the UE, having moved to the new cell, initiates an MO call. Capture the cell re-selection success/failure and the latency for each iteration, and sort the latency values observed across the iterations in ascending order. Record the Minimum, Average (sum of all latency values / total iterations; the total is 10 iterations in this case) and Maximum latency values observed in the table below:

Table 4-13 5G Cell Re-selection Success and Failure KPI

KPI | Repeat Times: 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
Cell Re-selection Success |  |  |  |  |  |  |  |  |  |
Cell Re-selection Failure |  |  |  |  |  |  |  |  |  |
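A minimal Python sketch of how the per-iteration verdicts and latencies for Table 4-13 could be derived from UE logs, assuming each iteration yields the reselected cell identity and the timestamps of the RRC_IDLE-to-RRC_CONNECTED transition (the event structure is illustrative, not tool-specific):

```python
# One entry per iteration: target cell PCI after reselection, and the timestamps
# (in seconds) at which the MO call left RRC_IDLE and reached RRC_CONNECTED.
iterations = [
    {"reselected_pci": 2, "expected_pci": 2, "t_idle": 10.000, "t_connected": 10.085},
    {"reselected_pci": 2, "expected_pci": 2, "t_idle": 55.000, "t_connected": 55.092},
    # ... remaining iterations, 10 in total
]

successes, failures, latencies_ms = 0, 0, []
for it in iterations:
    if it["reselected_pci"] == it["expected_pci"]:
        successes += 1
        # Idle-to-Connected latency on the target cell, in milliseconds.
        latencies_ms.append((it["t_connected"] - it["t_idle"]) * 1000.0)
    else:
        failures += 1

latencies_ms.sort()  # ascending order, as required for the report
print(f"Success: {successes}, Failure: {failures}")
if latencies_ms:
    print(f"Min: {latencies_ms[0]:.1f} ms, "
          f"Avg: {sum(latencies_ms) / len(latencies_ms):.1f} ms, "
          f"Max: {latencies_ms[-1]:.1f} ms")
```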


16 4.10 Idle mode Inter-O-DU mobility

17 4.10.1 Test description and applicability

The purpose of the test is to verify intra-O-CU, idle mode inter-O-DU mobility of a UE. The test validates the O-CU, O-DU and O-RU functionality in handling inter-cell mobility. Test scenarios are classified into two groups: Standalone (SA) intra-frequency cell reselection and Standalone (SA) inter-frequency cell reselection.

Idle mode inter-O-DU mobility shall follow 3GPP 38.133, Section 4.2.2.3 and Section 4.2.2.4 for intra-frequency and inter-frequency cell reselection measurements, respectively, and 3GPP 38.304, Section 5.2.2 for the state transition.

23 4.10.2 Test setup and configuration

In Standalone mode, the test setup consists of two 5G cells, each associated with a different O-DU connected to the same O-CU; refer to Figure 4-8 for the test setup topology. The test environment shall have a single UE in idle mode.


2 Figure 4-8 Inter O-DU mobility test bed for SA mode of operation

3 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
4 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors of the O-RU and the UE. The radio conditions of the UE are initially set to excellent. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE mobility between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

14 4.10.3 Test Procedure

Below are the SA mode intra-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. All the 5G cells are configured according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source and target 5G cells for inter-O-DU mobility are depicted in Figure 4-8.

5. The test UE is under source cell coverage.

6. Power on the UE; the UE shall successfully register to the source 5G cell.

7. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

8. Once the UE moves to the new cell, make an MO data call.

9. Repeat the above test steps for 10 iterations.


Below are the SA mode inter-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. All the 5G cells are configured according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source and target 5G cells for inter-O-DU mobility are depicted in Figure 4-8.

5. The test UE is under source cell coverage.

6. Power on the UE; the UE shall successfully register to the source 5G cell.

7. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

8. Once the UE moves to the new cell, make an MO data call.

9. Repeat the above test steps for 10 iterations.

13 4.10.4 Test requirements (expected results)

The idle mode inter-O-DU cell reselection shall be verified for both intra-frequency and inter-frequency. Also verify the call setup latency on the target cell with reference to the RRC connection call flow mentioned in 3GPP 38.331, Section 5.3.3.1. The following functionalities are also validated to declare the verdict.

• RRC connection establishment is complete.

• Cell re-selection is successful.

In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters shall be recorded and reported for the performance assessment.

gNB side:

• Idle mode mobility time, from the UE in idle mode at the source cell to the UE in idle mode at the target cell.

UE side:

• Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)

• KPIs related to cell re-selection failure, and idle mode mobility time from the UE in idle mode at the source cell to the UE in idle mode at the target cell.

Cell re-selection failure is determined by checking whether the cell re-selection succeeded in each iteration. The Idle-to-Connected latency on the target cell is measured as the time between the RRC_IDLE and RRC_CONNECTED states when the UE, having moved to the new cell, initiates an MO call. Capture the cell re-selection success/failure and the latency for each iteration, and sort the latency values observed across the iterations in ascending order. Record the Minimum, Average (sum of all latency values / total iterations; the total is 10 iterations in this case) and Maximum latency values observed in the table below:

Table 4-14 5G Cell Re-selection Success and Failure KPI

KPI | Repeat Times: 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
Cell Re-selection Success |  |  |  |  |  |  |  |  |  |
Cell Re-selection Failure |  |  |  |  |  |  |  |  |  |

1 4.11 Idle mode Inter-O-CU mobility

2 4.11.1 Test description and applicability

The purpose of the test is to verify inter-O-CU mobility of the UE. The test validates the O-CU and O-DU functionality in handling inter-O-CU mobility when both O-CUs are connected to the same 5G Core network (in SA), and the O-CU, O-DU and O-RU functionality in handling inter-cell mobility. Test scenarios are classified into two groups: Standalone (SA) intra-frequency cell reselection and Standalone (SA) inter-frequency cell reselection.

Idle mode inter-O-CU mobility shall follow 3GPP 38.133, Section 4.2.2.3 and Section 4.2.2.4 for intra-frequency and inter-frequency cell reselection measurements, respectively, and 3GPP 38.304, Section 5.2.2 for the state transition.

9 4.11.2 Test setup and configuration

In Standalone mode, the test setup consists of two 5G cells, each associated with a different O-DU and O-CU connected to the same 5G Core network; refer to Figure 4-9 for the test setup topology. The test environment shall have a single UE.

14 Figure 4-9 Inter O-CU mobility test bed for SA mode of operation


1 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
2 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors of the O-RU and the UE. The radio conditions of the UE are initially set to excellent. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE mobility between the cells can be achieved by changing the radio signal strength of the source and target cells using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell to the target cell.

13 4.11.3 Test Procedure


Below are the SA mode intra-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. Configure two 5G cells (cell 1 and cell 2) connected to different O-DUs and O-CUs according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source cell (source O-DU and source O-CU) is the cell where the UE is initially placed, as depicted in Figure 4-9.

5. Power on the UE; the UE shall successfully register to the source 5G cell.

6. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

7. Once the UE moves to the new cell, make an MO data call.

8. Repeat the above test steps for 10 iterations.

Below are the SA mode inter-frequency steps:

1. The 5G cell setup is configured following Section 3.2.

2. Configure two 5G cells (cell 1 and cell 2) connected to different O-DUs and O-CUs according to the test configuration. The cells are activated and unloaded.

3. Both 5G cells are configured as neighbours to each other, so that the UE can use them for cell re-selection.

4. The source cell (source O-DU and source O-CU) is the cell where the UE is initially placed, as depicted in Figure 4-9.

5. Power on the UE; the UE shall successfully register to the source 5G cell.

6. Wait till the UE goes into idle mode as per the UE inactivity timer, and then move the UE from the source cell to the target cell.

7. Once the UE moves to the new cell, make an MO data call.

8. Repeat the above test steps for 10 iterations.


1 4.11.4 Test requirements (expected results)

The idle mode inter-O-CU cell reselection shall be verified for both intra-frequency and inter-frequency. Also verify the call setup latency on the target cell with reference to the RRC connection call flow mentioned in 3GPP 38.331, Section 5.3.3.1. The following functionalities are also validated to declare the verdict.

• RRC connection establishment is complete.

• Cell re-selection is successful.

In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters shall be recorded and reported for the performance assessment.

gNB side:

• Idle mode mobility time, from the UE in idle mode at the source cell to the UE in idle mode at the target cell.

UE side:

• Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)

• KPIs related to cell re-selection failure, and idle mode mobility time from the UE in idle mode at the source cell to the UE in idle mode at the target cell.

Cell re-selection failure is determined by checking whether the cell re-selection succeeded in each iteration. The Idle-to-Connected latency on the target cell is measured as the time between the RRC_IDLE and RRC_CONNECTED states when the UE, having moved to the new cell, initiates an MO call. Capture the cell re-selection success/failure and the latency for each iteration, and sort the latency values observed across the iterations in ascending order. Record the Minimum, Average (sum of all latency values / total iterations; the total is 10 iterations in this case) and Maximum latency values observed in the table below:

Table 4-15 5G Cell Re-selection Success and Failure KPI

KPI | Repeat Times: 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
Cell Re-selection Success |  |  |  |  |  |  |  |  |  |
Cell Re-selection Failure |  |  |  |  |  |  |  |  |  |

23 4.12 5G/4G Inter-RAT Mobility - 5G to LTE handover

24 4.12.1 Test description and applicability

The purpose of this test is to validate inter-system handovers between 5G and LTE systems. The test validates the O-CU, O-DU, and O-RU functionality in handling inter-RAT mobility from 5G to LTE. This scenario covers handover of a UE from 5G to LTE when the UE is registered in 5G and moves from the 5G coverage area to the LTE coverage area. This scenario is applicable for LTE and 5G SA.


The test focuses on the procedure of ‘5GS to EPS handover using N26 interface’ as defined in 3GPP TS 23.502 Section 4.11.1.2.1, and ‘Inter-RAT mobility’ as defined in 3GPP TS 38.331 Section 5.4.

3 4.12.2 Test setup and configuration

The test setup shall include a 5G end-to-end system (gNB and 5G Core) and an LTE E2E system (O-eNB and EPC). The 5G and 4G cores shall be interconnected for inter-RAT mobility (N26 interface between the MME and AMF) and shall support a combined anchor point for 4G and 5G, i.e. SMF+PGW-C and UPF+PGW-U. The eNB connects to the 4G-5G interconnected core over the 4G S1 interface to provide 4G LTE service, forming the E2E LTE system, and the O-CU-CP/O-CU-UP, O-DU and O-RU connect to the 4G-5G interconnected core over the 5G N2 and N3 interfaces, forming the E2E 5G SA system. The DUT (gNB and O-eNB) and their components shall comply with the O-RAN specifications. A real or emulated UE is required. The test setup (UE, gNB, O-eNB, 5G and 4G Core) should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for. The O-eNB, gNB and their components (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the right configuration and software load.

15 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
16 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator. The radio conditions of the UE are initially set to excellent, using RSRP as the metric. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE handover between the cells can be achieved by changing the radio signal strength of the source cell (gNB) and target cell (O-eNB) using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (using RSRP as the metric, as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell (gNB) to the target cell (O-eNB).

28 4.12.3 Test Procedure

Below are the 5G to LTE inter-RAT handover steps:

1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test report. The serving cell under test is activated and unloaded. All other cells are powered off.

2. The 5G and LTE cells are configured as neighbours to each other, so that the UE can trigger measurement events for inter-RAT handover.

3. The test UE is under source cell (gNB) coverage.

4. Power on the UE; the UE shall successfully register to the source 5G cell.

5. The full-buffer UDP bi-directional data transmission (see Section 3.4) from the application server is initiated.

6. The UE shall move from the source cell (gNB) to the target cell (O-eNB) to perform the handover.


1 4.12.4 Test requirements (expected results)

The inter-RAT handover call flow shall be verified between the 5G and LTE systems. The following functionalities shall also be validated:

• A PDU Session is established when the full-buffer bi-directional data transmission is initiated with the 5G source cell.

• The handover is successful.

6 In addition to the common minimum set of configuration parameters defined (see Section 3.3), the following metrics
7 and counters shall be recorded and reported for the performance assessment.

8 eNB/gNB/Application server side:

9 • Transmit downlink throughput measured at application server in time (average per second)
10 • Received uplink throughput measured at application server in time (average per second)
11 • Uplink packet loss percentage during handover.
12 • Uplink BLER, MCS, MIMO rank (RI) on PUSCH in time (average per second)

13 UE side:

14 • Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)
15 • Downlink BLER, MCS, MIMO rank (RI) on PDSCH in time (average per second).
16 • Received Downlink throughput (L1 and L3 PDCP layers) in time (average per second).
17 • Downlink packet loss percentage during handover
18 • Uplink throughput (L1 and L3 PDCP layers) in time (average per second)
• Channel utilization, i.e. number of allocated/occupied downlink and uplink RBs in time (per TTI/average per second) and number of allocated/occupied slots in time.

• KPIs related to Handover failure, Call drop, Handover latency, Handover interruption time.
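The handover interruption time KPI can be estimated from the downlink packet trace captured at the UE. A minimal Python sketch, assuming per-packet receive timestamps have been exported from the capture and the handover falls inside the trace window (the timestamps below are placeholders):

```python
# Receive timestamps (seconds) of downlink packets at the UE, spanning the handover.
rx_times = [10.000, 10.001, 10.002, 10.003, 10.058, 10.059, 10.060]  # placeholders

def interruption_ms(timestamps: list[float]) -> float:
    """Estimate the handover interruption time as the largest inter-arrival gap (ms).

    With full-buffer traffic, packets arrive quasi-continuously, so the longest
    gap in the trace approximates the user-plane outage during the handover.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return max(gaps) * 1000.0

print(f"Estimated handover interruption: {interruption_ms(rx_times):.1f} ms")
```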

23 4.13 5G/4G Inter-RAT Mobility - LTE to 5G handover

24 4.13.1 Test description and applicability

The purpose of this test is to validate inter-system handovers between LTE and 5G systems. The test validates the O-CU, O-DU, and O-RU functionality in handling inter-RAT mobility from LTE to 5G. This scenario covers handover of a UE from LTE to 5G when the UE is registered in LTE and moves from the LTE coverage area to the 5G coverage area. This scenario is applicable for LTE and 5G SA.

The test focuses on the procedure of ‘EPS to 5GS handover using N26 interface’ as defined in 3GPP TS 23.502 Section 4.11.1.2.2, and ‘Inter-RAT mobility’ as defined in 3GPP TS 38.331 Section 5.4.

31 4.13.2 Test setup and configuration

The test setup shall include a 5G end-to-end system (gNB and 5G Core) and an LTE E2E system (O-eNB and EPC). The 5G and 4G cores shall be interconnected for inter-RAT mobility (N26 interface between the MME and AMF) and shall support a combined anchor point for 4G and 5G, i.e. SMF+PGW-C and UPF+PGW-U. The eNB connects to the 4G-5G interconnected core over the 4G S1 interface to provide 4G LTE service, forming the E2E LTE system, and the O-CU-CP/O-CU-UP, O-DU and O-RU connect to the 4G-5G interconnected core over the 5G N2 and N3 interfaces, forming the E2E 5G SA system. The DUT (gNB and O-eNB) and their components shall comply with the O-RAN specifications. A real or emulated UE is required. The test setup (UE, gNB, O-eNB, 5G and 4G Core) should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for. The O-eNB, gNB and their components (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the right configuration and software load.

8 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
9 recorded in the test report.

Laboratory setup: The radio conditions of the UE can be modified by a variable attenuator/fading generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator. The radio conditions of the UE are initially set to excellent, using RSRP as the metric. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable. The UE handover between the cells can be achieved by changing the radio signal strength of the source cell (O-eNB) and target cell (gNB) using variable attenuators.

Field setup: The drive route with source and target cells should be defined. The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (using RSRP as the metric, as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route from the source cell (O-eNB) to the target cell (gNB).

21 4.13.3 Test Procedure

Below are the LTE to 5G inter-RAT handover steps:

1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test report. The serving cell under test is activated and unloaded. All other cells are powered off.

2. The LTE and 5G cells are configured as neighbours to each other, so that the UE can trigger measurement events for inter-RAT handover.

3. The test UE is under source cell (O-eNB) coverage.

4. Power on the UE; the UE shall successfully register to the source LTE cell.

5. The full-buffer UDP bi-directional data transmission (see Section 3.4) from the application server is initiated.

6. The UE shall move from the source cell (O-eNB) to the target cell (gNB) to perform the handover.

31 4.13.4 Test requirements (expected results)

The inter-RAT handover call flow shall be verified between the LTE and 5G systems. The following functionalities shall also be validated:

• A PDU Session is established when the full-buffer bi-directional data transmission is initiated with the LTE source cell.

• The handover is successful.

36 In addition to the common minimum set of configuration parameters defined (see Section 3.3), the following metrics
37 and counters shall be recorded and reported for the performance assessment.


1 eNB/gNB/Application server side:

2 • Transmit downlink throughput measured at application server in time (average per second)
3 • Received uplink throughput measured at application server in time (average per second)
4 • Uplink packet loss percentage during handover.
5 • Uplink BLER, MCS, MIMO rank (RI) on PUSCH in time (average per second)

6 UE side:

7 • Radio parameters such as RSRP, RSRQ, SINR on PDSCH in time (average per second)
8 • Downlink BLER, MCS, MIMO rank (RI) on PDSCH in time (average per second).
9 • Received Downlink throughput (L1 and L3 PDCP layers) in time (average per second).
10 • Downlink packet loss percentage during handover
11 • Uplink throughput (L1 and L3 PDCP layers) in time (average per second)
• Channel utilization, i.e. number of allocated/occupied downlink and uplink RBs in time (per TTI/average per second) and number of allocated/occupied slots in time.

• KPIs related to Handover failure, Call drop, Handover latency, Handover interruption time.


1 Performance tests
This chapter describes the tests evaluating and assessing the performance of the radio access network from a network end-to-end perspective (see Section 3.1). The focus of the testing is on the end-user performance, which is compared against the target and expected performance values. Pass and fail thresholds are defined for the tests wherever possible.

5 The general test methodologies and configurations are mentioned in Chapter 3.

Unless otherwise stated in the chapter, the tests are suitable for, and can be performed in, both laboratory and field testing environments, with pros and cons for each environment. The specific lab and field test setups are mentioned in each test.

The following end-to-end performance tests are defined in this chapter as an extension of the NGMN testing framework [3]:

Table 5-1: E2E Performance Test Case summary

Test ID | E2E Performance Assessment | Functional group | Applicable: LTE | NSA | SA
5.2 | Downlink peak throughput | Integrity | Y | Y | Y
5.3 | Uplink peak throughput | Integrity | Y | Y | Y
5.4 | Downlink throughput in different radio conditions | Integrity | Y | Y | Y
5.5 | Uplink throughput in different radio conditions | Integrity | Y | Y | Y
5.6 | Bidirectional throughput in different radio conditions | Integrity | Y | Y | Y
5.7 | Downlink coverage throughput (link budget) | Integrity | Y | Y | Y
5.8 | Uplink coverage throughput (link budget) | Integrity | Y | Y | Y
5.9 | Downlink aggregated cell throughput (cell capacity) | Integrity | Y | Y | Y
5.10 | Uplink aggregated cell throughput (cell capacity) | Integrity | Y | Y | Y
5.11 | Impact of fronthaul latency on downlink peak throughput | Integrity | Y | Y | Y
5.12 | Impact of fronthaul latency on uplink peak throughput | Integrity | Y | Y | Y
5.13 | Impact of midhaul latency on downlink peak throughput | Integrity | N/A | Y | Y
5.14 | Impact of midhaul latency on uplink peak throughput | Integrity | N/A | Y | Y

12 Future versions of this specification may add additional end-to-end performance tests not currently addressed in this
13 version, for example:

14 • Downlink throughput with inter-cell interferences (multi-cell, single UE scenario)

15 • Uplink throughput with inter-cell interferences (multi-cell, single UE scenario)


1 • Downlink drive throughput (multi-cell, single UE scenario)

2 • Uplink drive throughput (multi-cell, single UE scenario)

3 • End-to-end latency (single cell scenario, single UE scenario)

4 • Impact of fronthaul compression scheme on downlink peak throughput (single cell, single UE scenario)

5 • Impact of fronthaul compression scheme on uplink peak throughput (single cell, single UE scenario)

6 • Resource scheduling (single cell, multi-UE scenario)

7 5.1 Expected throughput calculation


This section provides the methodology to calculate the expected theoretical downlink and uplink data throughput for 4G LTE as well as 5G NR, for both FDD and TDD duplex modes.

10 5.1.1 4G LTE
11 The 4G LTE expected theoretical downlink or uplink throughput at the physical layer can be calculated using the
12 following formula:

$$L1\ throughput\ [Mbps] = 10^{-6} \cdot \sum_{j=1}^{J} \left( v_{Layers}^{(j)} \cdot Q_m^{(j)} \cdot R \cdot \frac{N_{PRB}^{BW(j)} \cdot 12}{T_s} \cdot \left(1 - OH^{(j)}\right) \cdot DLUL_{ratio} \right)$$

14 where

15 • J is the number of aggregated LTE component carriers.

16 • 𝑅 is the coding rate corresponding to channel quality (CQI and SINR). The maximum coding rate is 0.9258.

17 For the j-th component carrier:

• $v_{Layers}^{(j)}$ is the number of MIMO layers. The maximum is 4 (8 in LTE-Advanced) in downlink and 1 (4 in LTE-Advanced) in uplink.

• $Q_m$ is the modulation order, which is equal to 2 for QPSK, 4 for 16QAM, 5 for 32QAM, 6 for 64QAM, and 8 for 256QAM.

• $N_{PRB}^{BW}$ is the number of PRBs allocated in bandwidth BW – see Table 5-2.

Table 5-2 The number of PRBs allocated in bandwidth

Bandwidth [MHz] | 1.4 | 3 | 5 | 10 | 15 | 20
Number of PRBs | 6 | 13 | 25 | 50 | 75 | 100
24 • OH is overhead for control channels and signalling (i.e. Reference Signal, PSS, SSS, PBCH, PDCCH, etc. in
25 downlink or SRS, PUCCH, PRACH in uplink) within a period of 1 sec – see Table 5-3 for downlink.

Table 5-3 Overhead as a function of bandwidth, no. of antenna ports and CFI

Bandwidth | 1.4 MHz (CFI 1 / 2) | 3 MHz (CFI 1 / 2) | 5 MHz (CFI 1 / 2) | 10 MHz (CFI 1 / 2) | 15 MHz (CFI 1 / 2) | 20 MHz (CFI 1 / 2)
1 TX (ant. ports) | 0.16 / 0.21 | 0.14 / 0.18 | 0.12 / 0.17 | 0.12 / 0.16 | 0.11 / 0.16 | 0.11 / 0.16
2 TX (ant. ports) | 0.20 / 0.24 | 0.17 / 0.22 | 0.16 / 0.21 | 0.15 / 0.20 | 0.15 / 0.20 | 0.15 / 0.20
4 TX (ant. ports) | 0.25 / 0.29 | 0.22 / 0.26 | 0.21 / 0.25 | 0.20 / 0.25 | 0.20 / 0.25 | 0.20 / 0.24
• $T_s$ is the average OFDM symbol duration in a subframe, equal to $T_s = 10^{-3}/14$ for normal Cyclic Prefix (14 OFDM symbols per subframe) or $T_s = 10^{-3}/12$ for extended CP (12 OFDM symbols per subframe).

• $DLUL_{ratio}$ is the ratio of the symbols allocated for DL or UL data to the total number of symbols in a frame (10 ms) – predefined patterns (uplink-downlink configuration, special subframe configuration) for DL/UL allocation in [25]. In case of FDD mode, the $DLUL_{ratio}$ is equal to 1.

Example of calculation of the expected downlink throughput for the following system configuration:

• MIMO 2x2 → $v_{Layers}^{(j)} = 2$

• BW = 20 MHz → $N_{PRB}^{BW} = 100$

• Normal CP

• CFI = 1 → OH = 0.15

• No carrier aggregation → J = 1

• MCS = 28 and modulation 64QAM → $Q_m = 6$ and $R = 0.9258$

• TDD with uplink-downlink configuration = 1 (i.e. DSUUD) and special subframe configuration = 7 (i.e. 10:2:2) → $DLUL_{ratio} = (14 \cdot 2 + 10)/(14 \cdot 5) = 0.543$

$$L1\ throughput\ [Mbps] = 10^{-6} \cdot \sum_{j=1}^{1} \left( 2 \cdot 6 \cdot 0.9258 \cdot \frac{100 \cdot 12}{10^{-3}/14} \cdot (1 - 0.15) \cdot 0.543 \right) = 86.1\ Mbps$$
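For test preparation and gap analysis, the formula above can be scripted directly. A minimal Python sketch (single component carrier, not part of the specification) that reproduces the worked example:

```python
def lte_l1_throughput_mbps(layers: int, qm: int, r: float, n_prb: int,
                           oh: float, dlul_ratio: float,
                           symbols_per_subframe: int = 14) -> float:
    """Expected theoretical LTE L1 throughput for one component carrier [Mbps].

    Implements the Section 5.1.1 formula: layers * Qm * R * (N_PRB * 12 / Ts)
    * (1 - OH) * DLULratio, with Ts = 1e-3 / symbols_per_subframe.
    For carrier aggregation, sum the per-carrier results.
    """
    ts = 1e-3 / symbols_per_subframe  # average OFDM symbol duration (normal CP)
    res_per_sec = n_prb * 12 / ts     # resource elements per second
    return 1e-6 * layers * qm * r * res_per_sec * (1 - oh) * dlul_ratio

# Worked example from Section 5.1.1: MIMO 2x2, 20 MHz (100 PRBs), CFI = 1,
# MCS 28 / 64QAM, TDD UL-DL configuration 1 with special subframe config 7.
dlul = (14 * 2 + 10) / (14 * 5)  # = 0.543
print(f"{lte_l1_throughput_mbps(2, 6, 0.9258, 100, 0.15, dlul):.1f} Mbps")  # ~86.1
```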

17 5.1.2 5G NR

18 The 5G NR expected theoretical downlink or uplink throughput at physical layer (UE category is not assumed) can be
19 calculated using the following formula [15]:

$$L1\ throughput\ [Mbps] = 10^{-6} \cdot \sum_{j=1}^{J} \left( v_{Layers}^{(j)} \cdot Q_m^{(j)} \cdot f^{(j)} \cdot R \cdot \frac{N_{PRB}^{BW(j),\mu} \cdot 12}{T_s^{\mu}} \cdot \left(1 - OH^{(j)}\right) \cdot DLUL_{ratio} \right)$$

21 where

22 • J is the total number of component carriers (CC) in a band or band combination. The maximum number is 16
23 [23].

24 • 𝑅 is the coding rate corresponding to channel quality (CQI, SINR), and it is calculated as Target_code_rate
25 [22]/1024. For LDPC code the maximum coding rate is 948/1024.

26 For the j-th component carrier:

• $v_{Layers}^{(j)}$ is the number of MIMO layers. The maximum is 8 in downlink and 4 in uplink [23].

• $Q_m$ is the modulation order, which is equal to 2 for QPSK, 4 for 16QAM, 5 for 32QAM, 6 for 64QAM, and 8 for 256QAM.

• $f^{(j)}$ is the scaling factor [24], which can take the values 0.4, 0.75, 0.8 or 1. The scaling factor is signalled per band and per band combination as per UE capability signalling.

32 • 𝑂𝐻 is the overhead of control channels and signalling within a period of 1 sec, and it is equal to

_______________________________________________________________________________________________.
© 2022 by the O-RAN ALLIANCE e.V. Your use is subject to the copyright statement on the cover page of this specification. 62
O-RAN.TIFG.E2E-Test.0-v04.00

1 o 0.14 for frequency range 1 (FR1) for DL

2 o 0.18 for frequency range 2 (FR2) for DL

3 o 0.08 for frequency range 1 (FR1) for UL

4 o 0.10 for frequency range 2 (FR2) for UL

• µ is the 5G NR numerology (sub-carrier spacing), which can take values from 0 to 4 [20]. The number of slots per frame and the sub-carrier spacing vary with the numerology – see Table 5-4.

Table 5-4 5G NR numerology

5G NR numerology µ | Sub-carrier spacing (SCS) | Number of slots per frame | Number of symbols per slot | Total number of symbols per frame
0 | 15 kHz | 10 | 14 (normal CP) | 140
1 | 30 kHz | 20 | 14 (normal CP) | 280
2 | 60 kHz | 40 | 14 (normal CP) | 560
2 | 60 kHz | 40 | 12 (extended CP) | 480
3 | 120 kHz | 80 | 14 (normal CP) | 1120
4 | 240 kHz | 160 | 14 (normal CP) | 2240
• $T_s^{\mu}$ is the average OFDM symbol duration in a subframe for numerology µ, equal to $T_s^{\mu} = 10^{-3}/(14 \cdot 2^{\mu})$ for normal Cyclic Prefix (14 OFDM symbols per slot) or $T_s^{\mu} = 10^{-3}/(12 \cdot 2^{\mu})$ for extended CP (12 OFDM symbols per slot).

• $N_{PRB}^{BW,\mu}$ is the number of PRBs allocated in bandwidth BW with numerology µ – see Table 5-5.

Table 5-5 The max. number of PRBs for each supported bandwidth and 5G NR numerology µ [18], [19]

µ \ BW [MHz] | 5 | 10 | 15 | 20 | 25 | 30 | 40 | 50 | 60 | 80 | 90 | 100 | 200 | 400
0 | 25 | 52 | 79 | 106 | 133 | 160 | 216 | 270 | N/A | N/A | N/A | N/A | N/A | N/A
1 | 11 | 24 | 38 | 51 | 65 | 78 | 106 | 133 | 162 | 217 | 245 | 273 | N/A | N/A
2 | N/A | 11 | 18 | 24 | 31 | 38 | 51 | 65 | 79 | 107 | 121 | 135 | N/A | N/A
3 | N/A | N/A | N/A | N/A | N/A | N/A | N/A | 66 | N/A | N/A | N/A | 132 | 264 | N/A
4 | N/A | N/A | N/A | N/A | N/A | N/A | N/A | 32 | N/A | N/A | N/A | 66 | 132 | 264
• $DLUL_{ratio}$ is the ratio of the symbols allocated for DL or UL data to the total number of symbols in a time period (periodicity) in TDD mode – predefined slot formats in [21]. In case of FDD mode, the $DLUL_{ratio}$ is equal to 1.

Example of calculation of the expected downlink throughput for the following system configuration:

• SCS = 15 kHz → µ = 0

• MIMO 2x2 → $v_{Layers}^{(j)} = 2$

• BW = 20 MHz → $N_{PRB}^{BW,\mu} = 106$

• Normal CP

• No carrier aggregation → J = 1

• MCS = 28 and modulation 64QAM → $Q_m = 6$ and Target code rate = 948

• Scaling factor f = 1

• Downlink throughput and FR1 → OH = 0.14

• TDD configuration 4:1 (DDDDSU), i.e. four DL slots (slot format 0 [21]), one Special slot (slot format 32 [21]) and one UL slot (slot format 1 [21]), where slot format 0 means allocation of all 14 symbols for DL, slot format 1 means allocation of all 14 symbols for UL, and slot format 32 means allocation of 10 symbols for DL, 2 symbols for the Guard Period and 2 symbols for UL → $DLUL_{ratio} = (14 \cdot 4 + 10)/(14 \cdot 6) = 0.7857$

$$L1\ throughput\ [Mbps] = 10^{-6} \cdot \sum_{j=1}^{1} \left( 2 \cdot 6 \cdot 1 \cdot \frac{948}{1024} \cdot \frac{106 \cdot 12}{10^{-3}/(14 \cdot 2^{0})} \cdot (1 - 0.14) \cdot 0.7857 \right) = 133.6\ Mbps$$
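As with the LTE case, the 5G NR formula can be scripted for gap analysis. A minimal Python sketch (single component carrier, illustrative only) that reproduces the worked example:

```python
def nr_l1_throughput_mbps(layers: int, qm: int, f: float, target_code_rate: int,
                          n_prb: int, mu: int, oh: float, dlul_ratio: float,
                          symbols_per_slot: int = 14) -> float:
    """Expected theoretical 5G NR L1 throughput for one component carrier [Mbps].

    Implements the Section 5.1.2 formula with
    Ts(mu) = 1e-3 / (symbols_per_slot * 2**mu).
    """
    r = target_code_rate / 1024              # coding rate
    ts = 1e-3 / (symbols_per_slot * 2**mu)   # average OFDM symbol duration
    res_per_sec = n_prb * 12 / ts            # resource elements per second
    return 1e-6 * layers * qm * f * r * res_per_sec * (1 - oh) * dlul_ratio

# Worked example from Section 5.1.2: SCS 15 kHz (mu = 0), MIMO 2x2, 20 MHz
# (106 PRBs), MCS 28 / 64QAM (target code rate 948), FR1 DL overhead 0.14,
# TDD pattern DDDDSU.
dlul = (14 * 4 + 10) / (14 * 6)  # = 0.7857
# Prints 133.7; the worked example rounds the intermediate ratio and reports 133.6.
print(f"{nr_l1_throughput_mbps(2, 6, 1.0, 948, 106, 0, 0.14, dlul):.1f} Mbps")
```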

10 5.2 Downlink peak throughput

11 5.2.1 Test description and applicability

12 The purpose of the test is to measure the peak (i.e. maximum achievable) user data throughput in the downlink direction
13 (i.e. data transmitted from application (traffic) server to UE). A stationary UE is placed under excellent radio conditions
14 inside an isolated cell.

15 5.2.2 Test setup and configuration

16 The test setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a
17 stationary UE (real or emulated UE) placed under excellent radio conditions as defined in Section 3.6 – using SINR as
18 the metric since this is a downlink test. Note that in this case of a single cell scenario, SINR is in fact SNR as inter-cell
19 interference is not present. Within the cell there should be only one active UE downloading data from the application
20 server. The application server should be placed as close as possible to the core/core emulator and connected to the
21 core/core emulator via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is
22 suitable for both lab and field environments.

23 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
24 recorded in the test report.

25 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
26 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
27 a UE emulator. The test environment should be setup to achieve excellent radio conditions (SINR as defined in Section
28 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside
29 an RF shielded box or RF shielded room if the UE is not connected via cable.

30 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
31 conditions (SINR as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not
32 be exceeded.

[Figure 5-1 shows the three test setups. 4G: UE –(Uu)– 4G eNB –(S1)– 3GPP Core (4G EPC) – Application server / 3GPP services / Other services; the O-RAN system under test comprises the eNB. 5G NSA (EN-DC): UE –(Uu)– 4G eNB (anchor) and 5G en-gNB interconnected via X2, each connected via S1 to the 3GPP Core (4G EPC, EN-DC NSA) – Application server; the O-RAN system under test comprises the eNB and en-gNB. 5G SA: UE –(Uu)– 5G gNB –(NG)– 3GPP Core (5GC SA) – Application server; the O-RAN system under test comprises the gNB.]

1 Figure 5-1 The test setups of 4G, 5G NSA and 5G SA

2 5.2.3 Test Procedure


3 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
4 report. The serving cell under test is activated and unloaded. All other cells are turned off.

5 2. The UE (real or emulated UE) is placed under excellent radio conditions (cell centre close to radiated eNB/gNB
6 antenna) using SINR thresholds as indicated in Section 3.6. The UE is powered on and attached to the network.

7 3. The downlink full-buffer UDP and TCP data transmission (see Section 3.4) from the application server should be
8 verified by adjusting the connection settings (cabled environment) or UE position (OTA environment). The UE under
9 excellent radio conditions that is achieving peak user throughput should see stable utilization of the highest possible
10 downlink MCS, downlink transport block size and downlink MIMO rank (number of layers). These KPIs should also
11 be verified.

12 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
13 data transmission from the application server to the UE is started. The UE should receive the data from the application
14 server.

15 5. All the required performance data (incl. the signalling and control data) as specified in the “Test requirements”
16 Section below is measured and captured at the UE and Application server sides using logging/measurement tools.
17 The duration of the test should be at least 3 minutes when the throughput is stable. The location and position of the
18 UE should remain unchanged during the entire measurement duration (capture of log data).

19 6. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server is
20 stopped.

21 7. [Optional] Steps 4 to 6 are repeated for downlink full-buffer TCP data transmission.

22 5.2.4 Test requirements (expected results)


23 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
24 should be captured and reported in the test report for the performance assessment

25 UE side (real or emulated UE):


1 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

2 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

3 • Received downlink throughput (L1 and Application layers) (average sample per second)

4 • Downlink transmission mode

5 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
6 (average sample per second)

7 Application server side:

8 • Transmitted downlink throughput (Application layer) (average sample per second)

9 When the UE is under excellent radio conditions (cell centre), the stable utilization of the highest possible downlink MCS,
10 downlink transport block size and downlink MIMO rank should be observed. The UE should also receive the data with
11 minimum downlink BLER.

Table 5-6 gives an example of the test results record (the median and standard deviation from the captured samples should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both the LTE and 5G paths. The spectral efficiency (see Section 3.8) should be calculated for benchmarking and comparison purposes in order to minimize the influence of differently configured parameters such as bandwidth and TDD DL/UL ratio.

Table 5-6 Example record of test results (median and standard deviation from the captured samples)

Metric | UDP | TCP
Received L1 DL throughput [Mbps] |  |
L1 DL Spectral efficiency [bps/Hz] |  |
Received Application DL throughput [Mbps] |  |
UE RSRP [dBm] |  |
UE RSRQ [dB] |  |
UE SINR [dB] |  |
MIMO rank |  |
PDSCH MCS |  |
DL PRB number |  |
PDSCH BLER [%] |  |
The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

19 • Received UDP downlink throughput (L1 and Application layer) vs Time duration

20 • PDSCH SINR vs Time duration

21 • Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots vs Time duration

22 The gap analysis should be provided for the measured and the expected target downlink throughputs which can be
23 calculated based on the procedures from Section 5.1.
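A minimal Python sketch of this reporting step, assuming per-second throughput samples have been exported from the logging tool and the expected throughput has been computed with the Section 5.1 method (all values below are placeholders):

```python
import statistics

# Per-second received L1 downlink throughput samples in Mbps (placeholders).
samples_mbps = [131.9, 132.4, 133.1, 132.8, 131.5, 132.9, 133.0, 132.2]

bandwidth_mhz = 20.0   # configured channel bandwidth
expected_mbps = 133.6  # from the Section 5.1.2 calculation

median = statistics.median(samples_mbps)
stdev = statistics.stdev(samples_mbps)
# Throughput normalised by channel bandwidth, as a stand-in for the
# Section 3.8 spectral efficiency definition (not reproduced here).
spectral_eff = median / bandwidth_mhz
gap_pct = 100.0 * (expected_mbps - median) / expected_mbps

print(f"Median: {median:.1f} Mbps, StDev: {stdev:.2f} Mbps")
print(f"Spectral efficiency: {spectral_eff:.2f} bps/Hz")
print(f"Gap to expected: {gap_pct:.1f} %")
```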


1 5.3 Uplink peak throughput

2 5.3.1 Test description and applicability

The purpose of the test is to measure the peak (i.e. maximum achievable) user data throughput in the uplink direction (i.e. data transmitted from the UE to the application (traffic) server). A stationary UE is placed under excellent radio conditions inside an isolated cell.

6 5.3.2 Test setup and configuration

The test setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a stationary UE (real or emulated UE) placed under excellent radio conditions as defined in Section 3.6 – using RSRP as the metric since this is an uplink test. Within the cell there should be only one active UE uploading data to the application server. The application server should be placed as close as possible to the core/core emulator and connected to the core/core emulator via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is suitable for both lab and field environments.

13 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
14 recorded in the test report.

15 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
16 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
17 a UE emulator. The test environment should be setup to achieve excellent radio conditions (RSRP as defined in Section
18 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be exceeded. The UE should be placed inside
19 an RF shielded box or RF shielded room if the UE is not connected via cable.

Field setup: The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s), where excellent radio conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not be exceeded.

The test setups of 4G, 5G NSA and 5G SA are shown in Figure 5-1.

24 5.3.3 Test Procedure

25 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
26 report. The serving cell under test is activated and unloaded. All other cells are turned off.

27 2. The UE (real or emulated UE) is placed under excellent radio condition (cell centre close to radiated eNB/gNB
28 antenna) using RSRP thresholds as indicated in Section 3.6. The UE is powered on and attached to the network.

29 3. The uplink full-buffer UDP and TCP data transmission (see Section 3.4) from UE to the application server should be
30 verified by adjusting the connection settings (cabled environment) or UE position (OTA environment). The UE under
31 excellent radio conditions that is achieving peak user throughput should see stable utilization of the highest possible
32 uplink MCS and uplink transport block size. These KPIs should also be verified.

33 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
34 data transmission from UE to the application server is started. The application server should receive the sent data.

35 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
36 requirements” section is measured and captured at UE, eNB/gNB and Application server sides using
37 logging/measurement tools. The duration of test should be at least 3 minutes when the throughput is stable. The


1 location and position of the UE should remain unchanged during the entire measurement duration (capture of log
2 data).

3 6. The capture of log data is stopped. The uplink full-buffer UDP data transmission from UE to the application server
4 is stopped.

5 7. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer TCP
6 data transmission from UE to the application server is started, and Step 5 is repeated.

7 5.3.4 Test requirements (expected results)

8 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
9 should be captured and reported in the test report for the performance assessment:

10 UE side (real or emulated UE):

11 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

12 • PUSCH BLER, PUSCH MCS (average sample per second)

13 • Transmit power on PUSCH

14 • Transmitted uplink throughput (Application layer) (average sample per second)

15 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
16 (average sample per second)

17 eNB/gNB side (if capture of logs is possible):

18 • Radio parameters such as PUSCH SINR (average per second)

19 • PUSCH BLER (average sample per second)

20 Application server side:

21 • Received uplink throughput (L1 and Application layers) (average sample per second)

22 When the UE is in excellent radio condition (cell centre), the stable utilization of the highest possible uplink MCS and
23 uplink transport block size should be observed and evaluated. The eNB/gNB should also receive the data with the
24 minimum uplink BLER.

25 Table 5-7 gives an example of the test results record (median and standard deviation from the measured samples
26 should be provided for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
27 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G. The spectral
28 efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the influence of
29 different configured parameters such as bandwidth and TDD DL/UL ratio.
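
A minimal sketch of this reporting convention, assuming per-second L1 throughput samples in Mbps and taking spectral
efficiency simply as throughput divided by the occupied channel bandwidth (the normative definition is given in
Section 3.8):

```python
import statistics

l1_ul_mbps = [112.4, 110.9, 113.1, 111.7, 112.0]   # placeholder per-second L1 samples [Mbps]
BANDWIDTH_MHZ = 100.0                              # configured channel bandwidth (example value)

median_mbps = statistics.median(l1_ul_mbps)
stdev_mbps = statistics.stdev(l1_ul_mbps)
# Mbps divided by MHz cancels the factor of 1e6, giving bps/Hz directly.
spectral_efficiency = median_mbps / BANDWIDTH_MHZ

print(f"median = {median_mbps:.1f} Mbps, st.dev = {stdev_mbps:.2f} Mbps, "
      f"SE = {spectral_efficiency:.2f} bps/Hz")
```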

30 Table 5-7 Example record of test results (median and standard deviation from the measured samples)

                                              UDP          TCP
Received L1 UL throughput [Mbps]
L1 UL Spectral efficiency [bps/Hz]
Received Application UL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
PUSCH transmit power [dBm]
PUSCH MCS
UL RB number
PUSCH BLER [%]
1 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

2 • Received UDP uplink throughput (L1 and Application layers) vs Time duration

3 • UE RSRP vs Time duration

4 • Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots vs Time duration

5 The gap analysis should be provided for the measured and the expected target uplink throughputs which can be calculated
6 based on the procedures from Section 5.1.
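
A minimal sketch of such a gap analysis, with placeholder values and assuming the expected target throughput has
already been derived per Section 5.1:

```python
def throughput_gap_percent(measured_mbps: float, target_mbps: float) -> float:
    """Relative gap between the measured and the expected target throughput."""
    return 100.0 * (target_mbps - measured_mbps) / target_mbps

# Placeholder values; the target comes from the Section 5.1 calculation.
print(f"uplink throughput gap = {throughput_gap_percent(102.0, 110.0):.1f} %")
```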

7 5.4 Downlink throughput in different radio conditions

8 5.4.1 Test description and applicability

9 The purpose of the test is to measure the user experienced data throughput in the downlink direction while varying
10 received radio signal quality (strength). The UE is placed in different stationary points inside the isolated cell.

11 5.4.2 Test setup and configuration

12 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with stationary
13 UE (real or emulated UE) placed in different radio conditions as defined in Section 3.6 - SINR should be considered in
14 case of downlink. Note that in this case of single cell scenario, SINR is in fact SNR as inter-cell interference is not
15 present. The UE is sequentially placed in different radio conditions ranging from good to poor. Note that the testing of
16 peak downlink throughput in excellent radio conditions is already covered in Section 5.2 and so is skipped in this section.
17 Within the cell there should be only one active UE in time downloading data from the application (traffic) server. The
18 application server should be placed as close as possible to the core/core emulator and connected to the core/core emulator
19 via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is suitable for both
20 lab and field environments.

21 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
22 recorded in the test report.

23 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
24 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
25 a UE emulator. The radio conditions of UE are initially set to good. The minimum coupling loss (see Section 3.6) should
26 not be exceeded. The change in radio conditions of UE, from good through fair to poor, is achieved by
27 increasing the attenuation of radio signal. The UE should be placed inside an RF shielded box or RF shielded room if the
28 UE is not connected via cable.

29 Field setup: The test points with good, fair and poor radio conditions (as defined in Section 3.6) should be defined inside
30 the serving cell. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE is initially placed where good
31 radio conditions (SINR as defined in Section 3.6) should be observed. The change in radio conditions is achieved by
32 moving the UE inside the serving cell from close to cell centre (with good radio conditions) to cell edge (with poor radio
33 conditions).

34 The test setups of 4G, 5G NSA and 5G SA are illustrated in Figure 5-1.


1 5.4.3 Test Procedure

2 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
3 report. The serving cell under test is activated and unloaded. All other cells are turned off.

4 2. The UE (real or emulated UE) is placed under good radio conditions (close to cell centre) using SINR thresholds as
5 indicated in Section 3.6. The UE is powered on and attached to the network.

6 3. The downlink full-buffer UDP and TCP data transmission (see Section 3.4) from the application server should be
7 verified by adjusting the connection settings (cabled environment) or UE position (OTA environment) to achieve
8 good radio conditions.

9 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
10 data transmission from the application server to the UE is started. The UE should receive the data from the application
11 server.

12 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
13 requirements” section is measured and captured at UE and Application server sides using logging/measurement tools.
14 The duration of test should be at least 3 minutes when the throughput is stable. The location and position of the UE
15 should remain unchanged during the entire measurement duration (capture of log data).

16 6. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server is
17 stopped.

18 7. The radio conditions of UE are changed to fair using SINR thresholds as indicated in Section 3.6. The steps 4 to 6
19 are repeated.

20 8. The radio conditions of the UE are changed to poor (cell edge) using SINR thresholds as indicated in Section
21 3.6. Steps 4 to 6 are repeated (see the classifier sketch after this procedure).

22 9. [Optional] Steps 4 to 8 are repeated for downlink full-buffer TCP data transmission.
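
The classifier sketch referenced in Step 8, mapping a per-second SINR sample to a radio-condition class; the threshold
values below are illustrative placeholders only, not the normative ranges of Section 3.6:

```python
def classify_radio_conditions(sinr_db: float) -> str:
    """Map a PDSCH SINR sample to a radio-condition class.

    The thresholds are placeholders; the normative ranges are in Section 3.6.
    """
    if sinr_db >= 20.0:
        return "excellent"
    if sinr_db >= 13.0:
        return "good"
    if sinr_db >= 5.0:
        return "fair"
    return "poor"

assert classify_radio_conditions(22.0) == "excellent"
assert classify_radio_conditions(-2.0) == "poor"
```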

23 5.4.4 Test requirements (expected results)

24 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
25 should be captured and reported in the test report for the performance assessment:

26 UE side (real or emulated UE):

27 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

28 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

29 • Received downlink throughput (L1 and Application layers) (average sample per second)

30 • Downlink transmission mode

31 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
32 (average sample per second)

33 Application server side:

34 • Transmitted downlink throughput (Application layer) (average sample per second)


1 As the UE moves from good (close to cell centre), to fair, and to poor (cell edge) radio conditions, the changing radio
2 conditions should cause the UE to report lower CQI and MIMO rank which results in assignment of lower MCS and
3 lower data throughput in the downlink.

4 Table 5-8 gives an example of the test results record (median and standard deviation from the captured samples
5 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
6 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G paths. The
7 spectral efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the
8 influence of different configured parameters such as bandwidth and TDD DL/UL ratio.

9 Table 5-8 Example record of test results (median and standard deviation from the captured samples)

                                              Good         Fair         Poor (cell edge)
                                              UDP / TCP    UDP / TCP    UDP / TCP
Received L1 DL throughput [Mbps]
L1 DL Spectral efficiency [bps/Hz]
Received Application DL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
PDSCH BLER [%]
10 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

11 • Received UDP downlink throughput (L1 and Application layer) vs Time duration

12 • PDSCH SINR vs Time duration

13 • Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots vs Time duration

14 The gap analysis should be provided for the measured and the expected target downlink throughputs which can be
15 calculated based on the procedures from Section 5.1.

16 5.5 Uplink throughput in different radio conditions

17 5.5.1 Test description and applicability

18 The purpose of the test is to measure the user experienced data throughput in uplink while varying received radio signal
19 quality (strength). The UE is placed in different stationary points inside the isolated cell.

20 5.5.2 Test setup and configuration


21 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with stationary
22 UE (real or emulated UE) placed in different radio conditions as defined in Section 3.6 - RSRP should be considered in
23 case of uplink. The UE is gradually placed in different radio conditions from good (close to cell centre) to poor (cell edge
24 – coverage limited cell edge in case of single cell scenario). Note that the testing of peak uplink throughput in excellent
25 radio conditions is already covered in Section 5.3 and so is skipped in this section. Within the cell there should be only
26 one active UE in time uploading data to the application (traffic) server. The application server should be placed as close


1 as possible to the core/core emulator and connected to the core/core emulator via a transport link with sufficient capacity
2 so as not to limit the expected data throughput. The test is suitable for both lab and field environments.

3 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
4 recorded in the test report.

5 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
6 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
7 a UE emulator. The radio conditions of UE are initially set to good. The minimum coupling loss (see Section 3.6) should
8 not be exceeded. The change in radio conditions of UE, from good to fair to poor, is achieved by increasing the attenuation
9 of radio signal. The UE should be placed inside an RF shielded box or RF shielded room if the UE is not connected via cable.

10 Field setup: The test points with good, fair and poor radio conditions (as defined in Section 3.6) should be identified
11 inside the serving cell. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE is initially placed where
12 good radio conditions (RSRP as defined in Section 3.6) should be observed. The change in radio conditions is achieved
13 by moving the UE inside the serving cell from close to cell centre (with good radio conditions) to cell edge (with poor
14 radio conditions).

15 The test setups of 4G, 5G NSA and 5G SA are illustrated in Figure 5-1.

16 5.5.3 Test Procedure

17 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
18 report. The serving cell under test is activated and unloaded. All other cells are turned off.

19 2. The UE (real or emulated UE) is placed under good radio condition (close to cell centre) using RSRP thresholds as
20 indicated in Section 3.6. The UE is powered on and attached to the network.

21 3. The uplink full-buffer UDP and TCP data transmission (see Section 3.4) from UE to the application server should be
22 verified by adjusting the connection settings (cabled environment) or UE position (OTA environment) to achieve
23 good radio conditions.

24 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
25 data transmission from UE to the application server is started. The application server should receive the data from
26 UE.

27 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
28 requirements” section is measured and captured at UE, eNB/gNB and application server side using
29 logging/measurement tools. The duration of test should be at least 3 minutes when the throughput is stable. The
30 location and position of the UE should remain unchanged during the entire measurement duration (capture of log
31 data).

32 6. The capture of log data is stopped. The uplink full-buffer UDP data transmission from UE to the application server
33 is stopped.

34 7. The radio conditions of UE are changed to fair using RSRP thresholds as indicated in Section 3.6. The steps 4 to 6
35 are repeated.

36 8. The radio conditions of the UE are changed to poor (cell edge) using RSRP thresholds as indicated in
37 Section 3.6. Steps 4 to 6 are repeated.

38 9. [Optional] Steps 4 to 8 are repeated for uplink full-buffer TCP data transmission.


1 5.5.4 Test requirements (expected results)

2 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
3 should be captured and reported in the test report for the performance assessment:

4 UE side (real or emulated UE):

5 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

6 • PUSCH BLER, PUSCH MCS (average sample per second)

7 • Transmit power on PUSCH

8 • Transmitted uplink throughput (Application layer) (average sample per second)

9 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
10 (average sample per second; a per-second binning sketch follows this list)

11 eNB/gNB side (if capture of logs is possible):

12 • Radio parameters such as PUSCH SINR (average per second)

13 • PUSCH BLER (average per second)

14 Application server side:

15 • Received uplink throughput (L1 and Application layers) (average sample per second)
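
A minimal sketch of the per-second binning behind the "(average sample per second)" convention used above, assuming
raw log samples are (timestamp in seconds, value) pairs:

```python
from collections import defaultdict

def one_second_averages(samples):
    """Average raw (timestamp_s, value) log samples into 1-second bins."""
    bins = defaultdict(list)
    for timestamp_s, value in samples:
        bins[int(timestamp_s)].append(value)
    return {second: sum(v) / len(v) for second, v in sorted(bins.items())}

raw = [(0.2, 98.0), (0.7, 101.0), (1.1, 99.5)]   # placeholder log samples
print(one_second_averages(raw))                  # {0: 99.5, 1: 99.5}
```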

16 Table 5-9 gives an example of the test results record (median and standard deviation from the captured samples
17 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
18 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G paths. The
19 spectral efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the
20 influence of different configured parameters such as bandwidth and TDD DL/UL ratio.

21 Table 5-9 Example record of test results (median and standard deviation from the measured samples)

                                              Good         Fair         Poor (cell edge)
                                              UDP / TCP    UDP / TCP    UDP / TCP
Received L1 UL throughput [Mbps]
L1 UL Spectral efficiency [bps/Hz]
Received Application UL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
PUSCH transmit power [dBm]
PUSCH MCS
UL RB number
PUSCH BLER [%]
22 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

23 • Received UDP uplink throughput (L1 and Application layers) vs Time duration

24 • UE RSRP vs Time duration


1 • Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots vs Time duration

2 The gap analysis should be provided for the measured and the expected target uplink throughputs which can be calculated
3 based on the procedures from Section 5.1.

4 5.6 Bidirectional throughput in different radio conditions

5 5.6.1 Test description and applicability

6 The purpose of the test is to measure the user experienced data throughput in both downlink and uplink in parallel while
7 varying received radio signal quality (strength). The UE is placed in different stationary points inside the isolated cell.
8 The test also includes the measurement of peak (maximum achievable) data throughput of UE located in the excellent
9 radio conditions, and cell edge coverage data throughput of UE located at the cell edge in the poor radio conditions.

10 5.6.2 Test setup and configuration

11 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with stationary
12 UE (real or emulated) placed in different radio conditions as defined in Section 3.6 – RSRP should be considered in this
13 case. The UE is gradually placed in different radio conditions from excellent (cell centre) to poor (cell edge – coverage
14 limited cell edge in case of single cell scenario). Within the cell there should be only one active UE in time simultaneously
15 downloading data from and uploading data to the application (traffic) server. The application server should be placed as
16 close as possible to the core/core emulator and connected to the core/core emulator via a transport link with sufficient
17 capacity so as not to limit the expected data throughput. The test is suitable for both lab and field environments.

18 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
19 recorded in the test report.

20 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
21 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
22 a UE emulator. The radio conditions of UE are initially set to excellent. The minimum coupling loss (see Section 3.6)
23 should not be exceeded. The change in radio conditions of UE, from excellent through good and fair to poor, is achieved
24 by increasing the attenuation of radio signal. The UE should be placed inside an RF shielded box or RF shielded room if the
25 UE is not connected via cable.

26 Field setup: The test points with excellent, good, fair and poor radio conditions (as defined in Section 3.6) should be
27 identified inside the serving cell. The minimum coupling loss (see Section 3.6) should not be exceeded. The UE is initially placed
28 in the cell center close to the eNB/gNB antenna(s), where excellent radio conditions (using RSRP as the metric as defined
29 in Section 3.6) should be observed. The change in radio conditions is achieved by moving the UE inside the serving cell
30 from cell centre (with excellent radio conditions) to cell edge (with poor radio conditions).

31 The test setups of 4G, 5G NSA and 5G SA are illustrated in Figure 5-1.

32 5.6.3 Test Procedure

33 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
34 report. The serving cell under test is activated and unloaded. All other cells are turned off.

35 2. The UE (real or emulated) is placed under excellent radio conditions using RSRP thresholds as indicated in Section
36 3.6. The UE is powered on and attached to the network.

37 3. The simultaneous downlink and uplink full-buffer UDP and TCP data transmission (see Section 3.4) should be
38 verified by adjusting the connection settings (cabled environment) or UE position (OTA environment). The UE under


1 excellent radio conditions that is achieving peak uplink and downlink user throughput should see stable utilization of
2 the highest possible MCS, transport block size and MIMO rank (number of layers). These KPIs should also be
3 verified.

4 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The simultaneous downlink
5 and uplink full-buffer UDP data transmissions are started (an illustrative sketch follows this procedure). Both the UE and application server should receive the data.

6 5. All the required performance data (incl. the signalling and control data) as specified in the “Test requirements”
7 Section below are measured and captured at UE, eNB/gNB and application server side using logging/measurement
8 tools. The duration of test should be at least 3 minutes when the throughput is stable. The location and position of the
9 UE should remain unchanged during the entire measurement duration (capture of log data).

10 6. The capture of log data is stopped. The simultaneous downlink and uplink full-buffer UDP data transmissions are
11 stopped.

12 7. The radio conditions of UE are changed to good as defined by both SINR and RSRP in Section 3.6, if possible. The
13 steps 4 to 6 are repeated.

14 8. The radio conditions of UE are changed to fair as defined by both SINR and RSRP in Section 3.6, if possible. The
15 steps 4 to 6 are repeated.

16 9. The radio conditions of the UE are changed to poor (cell edge) as defined by both SINR and RSRP in
17 Section 3.6, if possible. The steps 4 to 6 are repeated.

18 10. [Optional] Steps 4 to 9 are repeated for simultaneous downlink and uplink full-buffer TCP data transmission.
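
The sketch referenced in Step 4: as an informative illustration, the bidirectional full-buffer load could be generated
with two parallel iperf3 streams; the tool, server address and ports are assumptions of this sketch, not requirements
of this specification.

```python
import subprocess

SERVER = "app-server.example.net"   # placeholder; two iperf3 server instances are
                                    # assumed to listen on ports 5201 and 5202
ARGS = ["-u", "-b", "0", "-t", "180"]

# Uplink stream (UE -> server) and downlink stream (server -> UE, "-R" reverses
# the direction), started in parallel to create the bidirectional load of Step 4.
uplink = subprocess.Popen(["iperf3", "-c", SERVER, "-p", "5201", *ARGS])
downlink = subprocess.Popen(["iperf3", "-c", SERVER, "-p", "5202", "-R", *ARGS])
uplink.wait()
downlink.wait()
```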

19 5.6.4 Test requirements (expected results)

20 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
21 should be captured and reported in the test report for the performance assessment:

22 UE side (real or emulated):

23 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

24 • PDSCH BLER, PDSCH MCS, PUSCH BLER, PUSCH MCS (average sample per second)

25 • DL MIMO rank (number of layers) (average sample per second)

26 • Downlink transmission mode

27 • Transmit power on PUSCH

28 • Received downlink throughput (L1 and Application layers) (average sample per second)

29 • Transmitted uplink throughput (Application layer) (average sample per second)

30 • Channel utilization, i.e. Number of allocated/occupied PRBs and Number of allocated/occupied slots in both
31 downlink and uplink directions (average sample per second)

32 eNB/gNB side (if capture of logs is possible):

33 • Radio parameters such as PUSCH SINR (average per second)

34 Application server side:

35 • Received uplink throughput (L1 and Application layers) (average sample per second)


1 • Transmitted downlink throughput (Application layer) (average sample per second)

2 When the UE is under excellent radio conditions (cell centre), the stable utilization of the highest possible MCS and
3 transport block size in both uplink and downlink direction should be observed and evaluated. The UE and eNB/gNB
4 should also receive the data with the minimum downlink and uplink BLER, respectively.

5 Table 5-10 gives an example of the test results record (median and standard deviation from the captured samples
6 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
7 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G paths. The
8 spectral efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the
9 influence of different configured parameters such as bandwidth and TDD DL/UL ratio.

10 Table 5-10 Example record of test results (median and standard deviation from the captured samples)

                                              Excellent       Good         Fair         Poor
                                              (cell centre)                             (cell edge)
                                              UDP / TCP       UDP / TCP    UDP / TCP    UDP / TCP
Received L1 UL throughput [Mbps]
L1 UL Spectral efficiency [bps/Hz]
Received Application UL throughput [Mbps]
Received L1 DL throughput [Mbps]
L1 DL Spectral efficiency [bps/Hz]
Received Application DL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
PUSCH transmit power [dBm]
DL MIMO rank
DL MCS
UL MCS
DL RB number
UL RB number
DL PDSCH BLER [%]
UL PUSCH BLER [%]
11

12 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

13 • Received UDP uplink throughput (L1 and Application layer) vs Time duration

14 • Received UDP downlink throughput (L1 and Application layer) vs Time duration

15 • UE RSRP vs Time duration

16 • UE PDSCH SINR vs Time duration

17 • Number of allocated/occupied downlink PRBs and Number of allocated/occupied downlink slots vs Time
18 duration

19 • Number of allocated/occupied uplink PRBs and Number of allocated/occupied uplink slots vs Time duration


1 The bidirectional DL and UL throughputs should be compared with unidirectional downlink (Section 5.4) and uplink
2 (Section 5.5) throughputs. Assuming the same test conditions (radio conditions), the bidirectional and unidirectional
3 throughputs are expected to be equal.

4 The gap analysis should be provided for the measured and the expected target downlink and uplink throughputs which
5 can be calculated based on the procedures from Section 5.1.

6 5.7 Downlink coverage throughput (link budget)

7 5.7.1 Test description and applicability

8 The purpose of the test is to measure the downlink user data throughput (i.e. data transmitted from application (traffic)
9 server to UE) when the radio conditions of the UE change gradually. The test is performed by moving the UE from the centre to the edge of the
10 isolated cell on the main lobe of the eNB/gNB antenna until the UE loses coverage (call drop). The test assesses link adaptation
11 and its effect on scheduling, CQI, MCS, number of layers (MIMO rank) assignment, etc. during the movement of the UE from
12 excellent radio conditions to poor radio conditions.

13 5.7.2 Test setup and configuration

14 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UE (real
15 or emulated) slowly moving in the main lobe of eNB/gNB antenna out from cell centre to cell edge until UE loses the
16 coverage (call drop). The drive route inside the cell should be defined to cover the whole range of SINR values from
17 excellent (cell center) to poor (cell edge) radio conditions as defined in Section 3.6. Note that in case of single cell scenario
18 SINR is in fact SNR as inter-cell interference is not present. Within the cell there should be only one active UE
19 downloading UDP/TCP data from the application server. The application server should be placed as close as possible to
20 the core/core emulator and connected to the core/core emulator via a transport link with sufficient capacity so as not to
21 limit the expected data throughput. The test is suitable for both lab and field environments.

22 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
23 recorded in the test report.

24 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
25 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
26 a UE emulator. The radio conditions of UE are initially set to excellent. The minimum coupling loss (see Section 3.6)
27 should not be exceeded. The movement of UE out from cell centre to cell edge can be achieved by gradually increasing
28 the attenuation of radio signal to cover the whole range of SINR from excellent through good and fair to poor (as defined
29 in Section 3.6) until UE loses the coverage (call drop). The UE should be placed inside an RF shielded box or RF shielded
30 room if the UE is not connected via cable.

31 Field setup: The drive route inside the isolated cell should be defined to cover the whole range of SINR values from
32 excellent (cell center) through good and fair to poor (cell edge) (as defined in Section 3.6) until UE loses the coverage
33 (call drop). The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s). The minimum coupling loss
34 (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive route out
35 from cell centre to cell edge until UE loses the coverage (call drop) – see Figure 5-2.


[Figure: the UE moves along a drive route from the cell centre to the cell edge]

2 Figure 5-2 Testing of link budget in the field setup

3 5.7.3 Test Procedure

4 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
5 report. The serving cell under test is activated and unloaded. All other cells are turned off.

6 2. The UE (real or emulated) is placed under excellent radio condition (cell centre) using SINR thresholds as indicated
7 in Section 3.6. The UE is powered on and attached to the network.

8 3. The downlink full-buffer UDP and TCP data transmission (see Section 3.4) from the application server should be
9 verified. The UE under excellent radio conditions that is achieving peak user throughput should see stable utilization of
10 the highest possible downlink MCS, downlink transport block size and downlink MIMO rank (number of layers). These
11 KPIs should also be verified.

12 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
13 data transmission from the application server to the UE is started. The UE should receive the data from the application
14 server.

15 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
16 requirements” section is measured and captured at UE and Application server sides using logging/measurement tools.

17 6. In the field setup, the UE is moved along the defined drive route out from cell centre (excellent radio conditions) to
18 cell edge (poor radio conditions) on the main lobe of the eNB/gNB antenna at a constant speed of around 30 km/h
19 until UE loses the coverage (call drop).

20 7. In the lab setup, the attenuation between the antenna connectors of O-RU and UE is gradually increased until the UE
21 loses coverage (call drop); a sketch of such a sweep follows this procedure.

22 8. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server to
23 UE is stopped.

24 9. [Optional] Steps 4 to 8 are repeated for downlink full-buffer TCP data transmission.
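
The sweep sketch referenced in Step 7; set_attenuation_db() and ue_attached() are hypothetical placeholders for the
attenuator-control and UE-state APIs of the actual test equipment.

```python
import time

def set_attenuation_db(db: float) -> None:
    """Hypothetical driver call for the programmable attenuator (placeholder)."""

def ue_attached() -> bool:
    """Hypothetical probe of the UE RRC state (placeholder)."""
    return False

attenuation_db, STEP_DB, DWELL_S = 0.0, 1.0, 10.0
set_attenuation_db(attenuation_db)
while ue_attached():                 # Step 7: increase until the UE loses coverage
    attenuation_db += STEP_DB
    set_attenuation_db(attenuation_db)
    time.sleep(DWELL_S)              # let throughput and RRC state settle
print(f"call drop observed at {attenuation_db:.0f} dB attenuation")
```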

25 5.7.4 Test requirements (expected results)

26 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
27 should be captured and reported in the test report for the performance assessment:

28 UE side (real or emulated):


1 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

2 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

3 • Received downlink throughput (L1 and Application layers) (average sample per second)

4 • Downlink transmission mode (average sample per second)

5 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
6 (average sample per second)

7 • GPS coordinates (latitude, longitude) in the field setup

8 Application server side:

9 • Transmitted downlink throughput (Application layer) (average sample per second)

10 Initially when the UE is in the excellent radio conditions (cell centre), the stable utilization of the highest possible
11 downlink MCS, downlink transport block size and downlink MIMO rank should be observed and evaluated. The UE
12 should also receive the data with the minimum downlink BLER. As the UE moves out from excellent radio conditions
13 (cell centre), the radio conditions of the UE change gradually from excellent through good and fair to poor (see Section
14 3.6). The changing radio conditions may cause the UE to report lower CQI and MIMO rank, which results in assignment of
15 lower MCS and lower data throughput in downlink. Such a behavior should be observed and evaluated.

16 Note that the test results can be affected by the various coverage enhancing features (e.g. transmit/receive diversity) at
17 eNB/gNB and/or at UE sides. The used coverage enhancement features should also be listed in the test report.

18 The following figures should be included in the test report, and the behavior should be evaluated. In case of 5G SA and
19 NSA, SS-RSRP and SS-SINR should be reported. In case of 5G NSA and dual connectivity (EN-DC), the values should
20 be provided separately for both LTE and 5G paths. The spectral efficiency (see Section 3.8) should be calculated for
21 benchmarking and comparison in order to minimize the influence of different configured parameters such as bandwidth
22 and TDD DL/UL ratio.

23 • Received UDP downlink throughput (L1 and Application layer) vs PDSCH SINR (all the samples as well as
24 average curve with median values)

25 • Received UDP L1 downlink spectral efficiency vs PDSCH SINR (all the samples as well as average curve with
26 median values)

27 • Cumulative distribution function of PDSCH SINR

28 • Cumulative distribution function of PDSCH MCS

29 • Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots vs PDSCH SINR (all the
30 samples as well as average curve with median values)

31 In the field environment, the cell coverage distance in meters (i.e. a straight line from the cell site) that corresponds to the
32 cell-edge downlink application throughput of 6 Mbps for LTE and 50 Mbps for 5G NR should be recorded.
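
For the cumulative distribution function figures listed above, a minimal plotting sketch over placeholder PDSCH SINR
samples (the same pattern applies to the PDSCH MCS distribution):

```python
import numpy as np
import matplotlib.pyplot as plt

sinr_db = np.random.normal(loc=10.0, scale=6.0, size=1000)   # placeholder SINR samples [dB]

x = np.sort(sinr_db)
cdf = np.arange(1, x.size + 1) / x.size    # empirical CDF: fraction of samples <= x

plt.plot(x, cdf)
plt.xlabel("PDSCH SINR [dB]")
plt.ylabel("CDF")
plt.title("Cumulative distribution function of PDSCH SINR")
plt.savefig("pdsch_sinr_cdf.png")
```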

33 5.8 Uplink coverage throughput (link budget)

34 5.8.1 Test description and applicability

35 The purpose of the test is to measure the uplink user data throughput (i.e. data transmitted from UE to application (traffic)
36 server) when the radio conditions of the UE change gradually. The test is performed by moving the UE from the centre to the edge of the isolated
37 cell on the main lobe of the eNB/gNB antenna until the UE loses coverage (call drop). The test assesses link adaptation and its effect


1 on scheduling, uplink transmit power (power control), CQI, MCS, etc. during the movement of UE from excellent radio
2 conditions to poor radio conditions.

3 5.8.2 Test setup and configuration

4 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UE (real
5 or emulated) slowly moving in the main lobe of eNB/gNB antenna out from cell centre to cell edge until UE loses the
6 coverage (call drop). The drive route inside the cell should be defined to cover the whole range of RSRP values from
7 excellent (cell center) to poor (cell edge) radio conditions as defined in Section 3.6. Within the cell there should be only
8 one active UE uploading UDP/TCP data to application server. The application server should be placed as close as possible
9 to the core/core emulator and connected to the core/core emulator via a transport link with sufficient capacity so as not to
10 limit the expected data throughput. The test is suitable for both lab and field environments.

11 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
12 recorded in the test report.

13 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
14 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
15 a UE emulator. The radio conditions of UE are initially set to excellent. The minimum coupling loss (see Section 3.6)
16 should not be exceeded. The movement of UE out from cell centre to cell edge can be achieved by gradually increasing
17 the attenuation of radio signal to cover the whole range of RSRP from excellent through good and fair to poor (as defined
18 in Section 3.6) until UE loses the coverage (call drop). The UE should be placed inside RF shielded box or RF shielded
19 room if the UE is not connected via cable.

20 Field setup: The drive route inside the isolated cell should be defined to cover the whole range of RSRP values from
21 excellent (cell center) through good and fair to poor (cell edge) (as defined in Section 3.6) until UE loses the coverage
22 (call drop). The UE is placed in the centre of the cell close to the radiated eNB/gNB antenna(s). The minimum coupling
23 loss (see Section 3.6) should not be exceeded. The change in radio conditions is achieved by moving the UE along the drive
24 route out from cell centre to cell edge until UE loses the coverage (call drop) – see Figure 5-2.

25 5.8.3 Test Procedure

26 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
27 report. The serving cell under test is activated and unloaded. All other cells are turned off.

28 2. The UE (real or emulated) is placed under excellent radio condition (cell centre) using RSRP thresholds as indicated
29 in Section 3.6. The UE is powered on and attached to the network.

30 3. The uplink full-buffer UDP and TCP data transmission (see Section 3.4) from UE to the application server should be
31 verified. The UE under excellent radio conditions that is achieving peak user throughput should see stable utilization
32 of the highest possible uplink MCS and uplink transport block size. These KPIs should also be verified.

33 4. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
34 data transmission from the UE to the application server is started. The application server should receive data from
35 the UE.

36 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
37 requirements” section is measured and captured at UE, eNB/gNB and Application server sides using
38 logging/measurement tools.

39 6. In the field setup, the UE is moved along the defined drive route out from cell centre (excellent radio conditions) to
40 cell edge (poor radio conditions) on the main lobe of the eNB/gNB antenna at a constant speed of around 30 km/h
41 until UE loses the coverage (call drop).


1 7. In the lab setup, the attenuation between the antenna connectors of O-RU and UE is gradually increased until UE
2 loses coverage (call drop).

3 8. The capture of log data is stopped. The uplink full-buffer UDP data transmission from UE to the application server
4 is stopped.

5 9. [Optional] Steps 4 to 7 are repeated for uplink full-buffer TCP data transmission.

6 5.8.4 Test requirements (expected results)

7 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
8 should be captured and reported in the test report for the performance assessment:

9 UE side (real or emulated):

10 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

11 • PUSCH BLER, PUSCH MCS (average sample per second)

12 • Transmit power on PUSCH (average sample per second)

13 • Transmitted uplink throughput (Application layer) (average sample per second)

14 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
15 (average sample per second)

16 • GPS coordinates (latitude, longitude) in the field setup

17 eNB/gNB side (if capture of logs is possible):

18 • Radio parameters such as PUSCH SINR (average per second)

19 • PUSCH BLER (average per second)

20 Application server side:

21 • Received uplink throughput (L1 and Application layers) (average sample per second)

22 Initially when the UE is in the excellent radio conditions (cell centre), the stable utilization of the highest possible uplink
23 MCS and uplink transport block size should be observed and evaluated. The eNB/gNB should also receive the data with
24 the minimum uplink BLER. As the UE moves out from excellent radio conditions (cell centre), the radio conditions of
25 the UE change gradually from excellent through good and fair to poor (see Section 3.6). With deteriorating channel
26 conditions and increasing path loss, the eNB/gNB would start experiencing worse PUSCH/PUCCH BLER. To compensate
27 for the path loss and limit PUSCH/PUCCH BLER, the eNB/gNB shall command the UE to increase PUSCH/PUCCH transmit
28 power through the closed-loop Transmit Power Control (TPC) feature. The number of HARQ retransmissions may also be
29 increased. If the power control does not reduce PUSCH/PUCCH BLER below a threshold, the eNB/gNB shall schedule the UE
30 with lower MCS and lower MIMO rank in uplink, which results in lower uplink data throughput. The UE moving further
31 away from the eNB/gNB can cause radio link failure and call drop (due to UL max RLC retransmissions, failure in CQI
32 decoding, low uplink SINR, etc.), which results in triggering of the RRC connection re-establishment procedure. Such a
33 behavior should be observed and evaluated.

34 Note that the test results can be affected by the various coverage enhancing features (e.g. transmit/receive diversity) at
35 eNB/gNB and/or at UE sides. The used coverage enhancement features should also be listed in the test report.

36 The following figures should be included in the test report, and the behavior should be evaluated. In case of 5G SA and
37 NSA, SS-RSRP and SS-SINR should be reported. In case of 5G NSA and dual connectivity (EN-DC), the values should
38 be provided separately for both LTE and 5G paths. The spectral efficiency (see Section 3.8) should be calculated for


1 benchmarking and comparison in order to minimize the influence of different configured parameters such as bandwidth
2 and TDD DL/UL ratio.

3 • Received UDP uplink throughput (L1 and Application layer) vs RSRP (all the samples as well as average curve
4 with median values)

5 • Received UDP L1 uplink spectral efficiency vs RSRP (all the samples as well as average curve with median
6 values)

7 • PUSCH transmit power vs RSRP

8 • Cumulative distribution function of RSRP

9 • Cumulative distribution function of PUSCH MCS

10 • Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots vs RSRP (all the samples
11 as well as average curve with median values)

12 In the field environment, the cell coverage distance in meters (i.e. a straight line from the cell site) that corresponds to
13 the cell-edge uplink application throughput of 0.5 Mbps for LTE and 5 Mbps for 5G NR should be recorded.
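
A minimal sketch of deriving this coverage distance from the captured GPS coordinates and throughput samples; the site
coordinates and drive-log values are placeholders, and the straight line from the cell site is approximated by the
great-circle (haversine) distance:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS points."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlon / 2) ** 2)
    return 2.0 * 6371000.0 * math.asin(math.sqrt(a))

SITE = (50.0755, 14.4378)    # placeholder cell-site coordinates
THRESHOLD_MBPS = 5.0         # 5G NR cell-edge uplink application throughput target

# Placeholder drive-log samples: (latitude, longitude, application throughput [Mbps])
drive_log = [(50.0760, 14.4380, 180.0), (50.0900, 14.4500, 12.0), (50.0950, 14.4550, 4.1)]
distances = [haversine_m(*SITE, lat, lon)
             for lat, lon, mbps in drive_log if mbps >= THRESHOLD_MBPS]
print(f"coverage distance = {max(distances):.0f} m")
```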

14 5.9 Downlink aggregated cell throughput (cell capacity)

15 5.9.1 Test description and applicability

16 The purpose of the test is to validate the downlink aggregated cell throughput (downlink cell capacity) when the UEs are
17 distributed in a uniform or non-uniform way inside a cell. This test also captures the influence of the MU-MIMO feature on
18 aggregated cell throughput. However, the test does not require support of MU-MIMO or massive MIMO. The aggregated
19 cell throughput depends on the multiple factors like spatial distribution of UEs inside a cell, radio conditions of UEs, test
20 configuration supported by both UE and eNB/gNB, etc. The test covers two spatial distribution scenarios of test UEs, i.e.
21 uniform distribution and non-uniform distribution.

22 5.9.2 Test setup and configuration

23 The test setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with 10
24 stationary UEs (real or emulated UEs) in total where 1 UE should be placed in excellent radio conditions, 2 UEs in good
25 radio conditions, 4 UEs in fair radio conditions and 3 UEs in poor radio conditions. The radio conditions are defined in
26 Section 3.6 – SINR (range of values) should be considered in case of downlink. Note that in this case of single cell
27 scenario, SINR is in fact SNR as inter-cell interference is not present.

28 The UEs can be distributed in the following spatial distribution scenarios (see Figure 5-3):

29 a) uniform distribution: ten UEs are placed uniformly with a spatial separation in both horizontal and vertical
30 directions. Uniform distribution maximizes the potential of MU-MIMO, because the spatially separated UEs can
31 be served by different beams at the same time. The channel experienced by each UE is spatially un-correlated
32 which reduces the inter-beam interference. In addition, selecting orthogonal precoders will further reduce the
33 inter-beam interference.

34 b) non-uniform distribution: the UEs are grouped in clusters. The UEs in the cluster experience very similar radio
35 conditions. In addition, the UEs in the cluster cannot be spatially separated and served by different beams at the
36 same time. This distribution scenario restricts the scheduling options of eNB/gNB resulting in lower aggregated
37 cell throughput.


1 Figure 5-3 also shows the impact of different types of beamforming (namely horizontal beamforming, vertical
2 beamforming and 3D beamforming) on aggregated cell throughput. The number of vertical beams is usually lower
3 compared to horizontal beams, i.e. the number of UEs which can be spatially separated and served by different vertical
4 beams at the same time is also lower resulting in lower aggregated cell throughput compared to horizontal
5 beamforming. The typical scenario for vertical beamforming is a high-rise building where each floor can be covered by
6 a different vertical beam. For example, in Figure 5-3 a), UEs 2 and 4 are located in the coverage of the same
7 horizontal beam and thus cannot be served at the same time, while in Figure 5-3 b), the same UEs 2 and 4 are located
8 in the coverage of two different vertical beams and thus can be served at the same time.

[Figure: UE distribution scenarios inside the cell. Legend: excellent, good, fair and poor radio conditions; beams 1-3.
a) uniform distribution with horizontal beamforming      b) uniform distribution with vertical beamforming
c) uniform distribution with 3D beamforming              d) non-uniform distribution with horizontal beamforming]

1 Figure 5-3 The distribution scenarios of UEs inside the cell using horizontal, vertical or 3D beamforming

2 Within the cell there should be 10 active UEs in total downloading data from the application server. The application server
3 should be placed as close as possible to the core/core emulator and connected to the core/core emulator via a transport
4 link with sufficient capacity so as not to limit the expected data throughput. The test is suitable for both lab and field
5 environments.

6 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
7 recorded in the test report.
8
9 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
10 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
11 a UE emulator. It is recommended to use a UE emulator to properly distribute the UEs inside a cell according to selected
12 distribution scenario. The minimum coupling loss (see Section 3.6) should not be exceeded. The UEs should be placed
13 inside an RF shielded box or RF shielded room if the UEs are not connected via cables.
14
15 Field setup: The UEs should be distributed inside a cell according to selected distribution scenario. The minimum
16 coupling loss (see Section 3.6) should not be exceeded.

17 5.9.3 Test Procedure

18 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
19 report. The serving cell under test is activated and unloaded. All other cells are turned off.

20 2. Ten UEs (real or emulated) are placed inside a serving cell according to uniform distribution scenario, and in radio
21 conditions using SINR thresholds as indicated in Section 3.6 - 1 UE should be placed in excellent radio conditions, 2
22 UEs in good radio conditions, 4 UEs in fair radio conditions and 3 UEs in poor radio conditions. The UEs are powered
23 on and attached to the network.


1 3. The downlink full-buffer UDP and TCP data transmissions (see Section 3.4) from the application server to all UEs
2 should be verified.

3 4. The UEs should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
4 data transmissions from the application server to all UEs are started. All UEs should receive the data from application
5 server.

6 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
7 requirements” section is measured and captured at all UEs and Application server using logging/measurement tools.

8 6. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server to
9 UEs is stopped.

10 7. [Optional] Steps 4 to 6 are repeated for downlink full-buffer TCP data transmissions.

11 8. [Optional] Non-uniform spatial distribution scenario of UEs is setup. Steps 3 to 7 are repeated.

12 5.9.4 Test requirements (expected results)

13 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
14 should be captured and reported in the test report for the performance assessment

15 UE side (real or emulated UE):

16 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

17 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

18 • Received downlink throughput (L1 and Application layers) (average sample per second)

19 • Downlink transmission mode

20 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
21 (average sample per second)

22 • GPS coordinates (latitude, longitude) in the field setup

23 Application server side:

24 • Transmitted downlink throughput (Application layer) (average sample per second)

25 Table 5-11 gives an example of the test results record (median and standard deviation from the captured samples
26 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of 5G NSA
27 and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G paths. The spectral
28 efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the influence
29 of different configured parameters such as bandwidth and TDD DL/UL ratio.

31 Table 5-11 Example record of test results (median and standard deviation from the captured samples)

                                                        For each UE (UE 1 to UE 10)
                                                        UDP / TCP
Spatial distribution scenario [uniform/non-uniform]
Received L1 DL throughput [Mbps]
Aggregated cell L1 DL throughput [Mbps]                 sum of Received L1 DL throughput (UE 1 to UE 10)
L1 DL Spectral efficiency [bps/Hz]
Received Application DL throughput [Mbps]
Aggregated cell App. DL throughput [Mbps]               sum of Received App. DL throughput (UE 1 to UE 10)
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
Aggregated DL RB number                                 sum of DL RB number (UE 1 to UE 10)
PDSCH BLER [%]
1
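
The aggregated rows of Table 5-11 are plain sums over the ten UEs; a minimal sketch with placeholder per-UE values:

```python
# Placeholder per-UE values for UE 1 to UE 10
received_l1_dl_mbps = [95.0, 60.0, 55.0, 30.0, 28.0, 25.0, 22.0, 8.0, 6.0, 5.0]
dl_rb_number = [24, 25, 26, 27, 26, 25, 27, 28, 26, 27]

aggregated_cell_l1_dl = sum(received_l1_dl_mbps)   # "Aggregated cell L1 DL throughput" row
aggregated_dl_rb = sum(dl_rb_number)               # "Aggregated DL RB number" row
print(f"aggregated cell L1 DL throughput = {aggregated_cell_l1_dl:.1f} Mbps, "
      f"aggregated DL RB number = {aggregated_dl_rb}")
```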

2 In addition, the used spatial distribution scenario(s) should be properly described and depicted.

3 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

4 • Received total UDP/TCP downlink throughput at the Application server (L1 and Application layers) vs Time
5 duration

6 For each UE:

7 • UE PDSCH SINR vs Time duration

8 • number of allocated/occupied downlink PRBs and Number of allocated/occupied slots vs Time duration

9 5.10 Uplink aggregated cell throughput (cell capacity)

10 5.10.1 Test description and applicability

11 The purpose of the test is to validate the uplink aggregated cell throughput (uplink cell capacity) when the UEs are
12 distributed in a uniform or non-uniform way inside a cell. This test also captures the influence of the MU-MIMO feature on
13 aggregated cell throughput. The aggregated cell throughput depends on multiple factors such as the spatial distribution of
14 UEs inside a cell, radio conditions of UEs, test configuration supported by both UE and eNB/gNB, etc. The test covers
15 two spatial distribution scenarios of test UEs, i.e. uniform distribution and non-uniform distribution.

16 5.10.2 Test setup and configuration

17 The test setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with 10
18 stationary UEs (real or emulated UEs) in total where 1 UE should be placed in excellent radio conditions, 2 UEs in good
19 radio conditions, 4 UEs in fair radio conditions and 3 UEs in poor radio conditions. The radio conditions are defined in
20 Section 3.6 – RSRP (range of values) should be considered in case of uplink.

21 The UEs can be distributed in the following spatial distribution scenarios (see Figure 5-3):

22 a) uniform distribution: ten UEs are placed uniformly with a spatial separation in both horizontal and vertical
23 directions. Uniform distribution maximizes the potential of MU-MIMO, because the spatially separated UEs can
24 be served by different beams at the same time. The channel experienced by each UE is spatially uncorrelated,
25 which reduces the inter-beam interference.

26 b) non-uniform distribution: the UEs are grouped in clusters. The UEs in the cluster experience very similar radio
27 conditions. In addition, the UEs in the cluster cannot be spatially separated and served by different beams at the

1 same time. This distribution scenario restricts the scheduling options of eNB/gNB resulting in lower aggregated
2 cell throughput.

3 Within the cell there should be 10 active UEs in total uploading data to the application server. The application server
4 should be placed as close as possible to the core/core emulator and connected to the core/core emulator via a transport
5 link with sufficient capacity so as not to limit the expected data throughput. The test is suitable for both lab and field
6 environments.

7 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
8 recorded in the test report.
9
10 Laboratory setup: The radio conditions experienced by the UE can be modified using a fading generator inserted
11 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
12 It is recommended to use a UE emulator to properly distribute the UEs inside a cell according to selected distribution
13 scenario. The minimum coupling loss (see Section 3.6) should not be exceeded. The UEs should be placed inside an RF
14 shielded box or RF shielded room if the UEs are not connected via cables.
15
16 Field setup: The UEs should be distributed inside a cell according to selected distribution scenario. The minimum
17 coupling loss (see Section 3.6) should not be exceeded.

18 5.10.3 Test Procedure

19 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
20 report. The serving cell under test is activated and unloaded. All other cells are turned off.

21 2. Ten UEs (real or emulated) in total are placed inside a serving cell according to uniform distribution scenario, and in
22 radio conditions using RSRP thresholds as indicated in Section 3.6 - 1 UE should be placed in excellent radio
23 conditions, 2 UEs in good radio conditions, 4 UEs in fair radio conditions and 3 UEs in poor radio conditions. The
24 UEs are powered on and attached to the network.

25 3. The uplink full-buffer UDP and TCP data transmissions (see Section 3.4) from all UEs to application server should
26 be verified.

27 4. The UEs should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
28 data transmissions from all UEs to the application server are started. The application server should receive data from
29 all UEs.

30 5. All the required performance data (incl. the signalling and control data) as specified in the following “Test
31 requirements” section is measured and captured at all UEs, eNB/gNB and Application server using
32 logging/measurement tools.

33 6. The capture of log data is stopped. The uplink full-buffer UDP data transmissions from the UEs to application server
34 are stopped.

35 7. [Optional] Steps 4 to 6 are repeated for uplink full-buffer TCP data transmissions.

36 8. [Optional] The non-uniform spatial distribution scenario of UEs is set up. Steps 3 to 7 are repeated.

37 5.10.4 Test requirements (expected results)

38 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
39 should be captured and reported in the test report for the performance assessment:

40 UE side (real or emulated UE):


1 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

2 • PUSCH BLER, PUSCH MCS (average sample per second)

3 • Transmit power on PUSCH (average sample per second)

4 • Transmitted uplink throughput (Application layer) (average sample per second)

5 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
6 (average sample per second)

7 • GPS coordinates (latitude, longitude) in the field setup

8 eNB/gNB side (if capture of logs is possible):

9 • Radio parameters such as PUSCH SINR (average per second)

10 • PUSCH BLER (average per second)

11 Application server side:

12 • Received uplink throughput (L1 and Application layers) (average sample per second)

13 Table 5-12 gives an example of the test results record (median and standard deviation from the captured samples
14 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
15 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G paths. The
16 spectral efficiency (see Section 3.8) should be calculated for benchmarking and comparison in order to minimize the
17 influence of different configured parameters such as bandwidth and TDD DL/UL ratio.

18 Table 5-12 Example record of test results (median and standard deviation from the captured samples)

For each UE (UE 1 to UE 10)
  UDP / TCP
  Spatial distribution scenario [uniform/non-uniform]
  Received L1 UL throughput [Mbps]
  Aggregated cell L1 UL throughput [Mbps] = sum of Received L1 UL throughput (UE 1 to UE 10)
  L1 UL Spectral efficiency [bps/Hz]
  Received Application UL throughput [Mbps]
  Aggregated cell App. UL throughput [Mbps] = sum of Received App. UL throughput (UE 1 to UE 10)
  UE RSRP [dBm]
  UE PDSCH SINR [dB]
  PUSCH transmit power [dBm]
  PUSCH MCS
  UL RB number
  Aggregated UL RB number = sum of UL RB number (UE 1 to UE 10)
  PUSCH BLER [%]
19

20 In addition, the spatial distribution scenario(s) used should be properly described and depicted.

21 The following figures should also be included in the test report, and the stable behavior should be observed and evaluated.

22 • received total UDP/TCP uplink throughput at Application server (L1 and Application layers) vs Time duration


1 For each UE:

2 • UE RSRP vs Time duration

3 • number of allocated/occupied uplink PRBs and Number of allocated/occupied slots vs Time duration

4 5.11 Impact of fronthaul latency on downlink peak throughput

5 5.11.1 Test description and applicability

6 The purpose of the test is to evaluate the user peak downlink throughput as a function of the fronthaul transport latency
7 (i.e. the one-way transmission delay between O-RU and O-DU at the OpenFH (CUS/M-plane) interface), and to identify
8 the maximum applicable fronthaul transport latency. Since the fronthaul transport latency corresponds with the distance
9 (fiber length) between O-RU and O-DU, the results of the test can also be used to identify the maximum distance (fiber
10 length) with an acceptable degradation of user peak downlink throughput.

11 5.11.2 Test setup and configuration

12 The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a
13 stationary UE (real or emulated UE) placed in the excellent radio condition as defined in Section 3.6 - SINR should be
14 considered in case of downlink. Within the cell there should be only one active UE downloading data from the application
15 server. The application server should be placed as close as possible to the core/core emulator and connected to the
16 core/core emulator via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is
17 suitable for both lab and field environments.
18 The reference measurement points of the fronthaul latency (R1/R4 – Transmit/Receive interface at O-DU (CU-plane);
19 R2/R3 – Receive/Transmit interfaces at O-RU (CU-plane)) [16] are shown in Figure 5-4. The transmission delays between
20 O-RU and O-DU are specified as T12 (downlink direction) and T34 (uplink direction). The transmission delay should be
21 symmetrical and equal in both directions. The transmission delay encompasses only the time from when a bit leaves the
22 sender (R1/R3) until it is received at the receiver (R2/R4).
25 Figure 5-4 Definition of reference measurement points for fronthaul latency [16]
26

27 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
28 recorded in the test report.

29 Laboratory setup: The radio conditions of UE can be modified by a variable attenuator/fading generator inserted between
30 the antenna connectors of O-RU and UE. The minimum attenuation of radio signal should be set to achieve the excellent
31 radio conditions (SINR as defined in Section 3.6), but the minimum coupling loss (see Section 3.6) should not be
32 exceeded. The UE should be placed inside RF shielded box or RF shielded room if the UE is not connected via cable.


1 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
2 conditions (SINR as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not
3 be exceeded.

4 In both laboratory and field setups, Figure 5-5, the fronthaul latency can be modified by using various lengths of fibre
5 (assuming a typical fibre delay of around 5 us per kilometre) or a network impairment emulator inserted between
6 O-RU and O-DU. It is necessary to know/measure the transmission delays produced by all fronthaul transport components
7 between O-RU and O-DU in order to properly estimate the total fronthaul latency (T12/T34).

[Diagram: three system-under-test configurations with the inserted OpenFH latency between O-RU and O-DU/O-CU –
4G (S1 to 4G EPC), 5G NSA EN-DC (4G and 5G legs with X2 to 4G EPC) and 5G SA (NG to 5G core) – each connected
to the 3GPP core, other services and the application server]

8 Figure 5-5 The fronthaul transport latency test setups of 4G, 5G NSA and 5G SA

9 5.11.3 Test Procedure

10 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
11 report. The serving cell under test is activated and unloaded. All other cells are turned off.

12 2. The UE (real or emulated) is placed in the excellent radio condition (cell centre) as defined by SINR in Section 3.6.
13 The UE is powered on and attached to the network.

14 3. The downlink full-buffer UDP and TCP data transmission (see Section 3.4) from the application server should be
15 verified. Excellent radio conditions delivering peak user throughput are identified by stable utilization of the
16 highest possible downlink MCS, downlink transport block size and downlink MIMO rank (number of layers). The
17 utilization of these KPIs should also be verified.

18 4. The minimum fronthaul latency (one-way transmission delay between O-DU and O-RU) should be set using a
19 network impairment emulator or a fibre with corresponding length (assuming typical delay of fibre around 5 us per
20 kilometre).


1 5. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
2 data transmission from the application server to the UE is started. The UE should receive the data from the application
3 server.

4 6. All the required performance data (incl. the signalling and control data) as specified in the following “Test
5 requirements” section is measured and captured at UE and Application server using logging/measurement tools.

6 7. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server to the
7 UE is stopped.

8 8. The fronthaul latency is increased by 20 us if no degradation of user peak downlink throughput was observed in the
9 previous measurement. As soon as a degradation of user peak downlink throughput is observed, the fronthaul
10 latency is increased only by 5 us in order to capture fine-grained log data.

11 9. Steps 5 to 8 are repeated as long as the total degradation of user peak downlink throughput is less than 30%. The KPIs
12 measured with the minimum fronthaul latency are used as a baseline (100%) for calculation of the degradation (a
sketch of this sweep is given after this list).

13 10. [Optional] Steps 4 to 9 are repeated for downlink full-buffer TCP data transmission.

14 5.11.4 Test requirements (expected results)

15 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
16 should be recorded and reported for the performance assessment.
17
18 UE side (real or emulated UE):

19 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

20 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

21 • Received downlink throughput (L1, and Application layers) (average sample per second)

22 • Downlink transmission mode

23 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
24 (average sample per second)

25 • GPS coordinates (latitude, longitude) in the field setup

26 Application server side:

27 • Transmitted downlink throughput (Application layer) (average sample per second)

28 When the UE is in the excellent radio conditions (cell centre), the stable utilization of the highest possible downlink MCS,
29 downlink transport block size and downlink MIMO rank should be observed and evaluated. The UE should also receive
30 the data with the minimum downlink BLER.
31
32 Table 5-13 gives an example of the test results record (median and standard deviation from the captured samples
33 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
34 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G.
35
36 Table 5-13 Example record of test results (median and standard deviation from the captured samples)

For each measured fronthaul latency value
  UDP / TCP
  Total fronthaul transport latency (T12/T34) [us]
  Received L1 DL throughput [Mbps]
  Degradation of Received L1 DL throughput [%]#
  Received Application DL throughput [Mbps]
  UE RSRP [dBm]
  UE RSRQ [dB]
  UE PDSCH SINR [dB]
  MIMO rank
  PDSCH MCS
  DL PRB number
  PDSCH BLER [%]
1 # The “Received L1 DL throughput” measured with the minimum fronthaul latency is used as a baseline (100%) for calculation of the degradation.

2 The following figures should also be included in the test report.

3 • Received UDP/TCP downlink throughput (L1 and Application layers) vs Total fronthaul latency (T12/T34)

4 5.12 Impact of fronthaul latency on uplink peak throughput

5 5.12.1 Test description and applicability

6 The purpose of the test is to evaluate the user peak uplink throughput as a function of the fronthaul transport latency (i.e.
7 the one-way transmission delay between O-RU and O-DU at the OpenFH (CUS/M-plane) interface), and to identify the
8 maximum applicable fronthaul transport latency. Since the fronthaul transport latency corresponds with the distance
9 (fiber length) between O-RU and O-DU, the results of the test can also be used to identify the maximum distance (fiber
10 length) with an acceptable degradation of user peak uplink throughput.

11 5.12.2 Test setup and configuration

12 The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a
13 stationary UE (real or emulated UE) placed in the excellent radio condition as defined in Section 3.6 - RSRP should be
14 considered in case of uplink. Within the cell there should be only one active UE uploading data to the application server.
15 The application server should be placed as close as possible to the core/core emulator and connected to the core/core
16 emulator via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is suitable
17 for both lab and field environments.
18 The reference measurement points of the fronthaul latency (R1/R4 – Transmit/Receive interface at O-DU (CU-plane);
19 R2/R3 – Receive/Transmit interfaces at O-RU (CU-plane)) [16] are shown in Figure 5-4. The transmission delays between
20 O-RU and O-DU are specified as T12 (downlink direction) and T34 (uplink direction). The transmission delay should be
21 symmetrical and equal in both directions. The transmission delay encompasses only the time from when a bit leaves the
22 sender (R1/R3) until it is received at the receiver (R2/R4).

23
24 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
25 recorded in the test report.

26 Laboratory setup: The radio conditions of UE can be modified by a variable attenuator/fading generator inserted between
27 the antenna connectors of O-RU and UE. The minimum attenuation of radio signal should be set to achieve the excellent


1 radio conditions (RSRP as defined in Section 3.6), but the minimum coupling loss (see Section 3.6) should not be
2 exceeded. The UE should be placed inside RF shielded box or RF shielded room if the UE is not connected via cable.

3 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
4 conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should
5 not be exceeded.

6 In both laboratory and field setups, Figure 5-5, the fronthaul latency can be modified by using various lengths of fibre
7 (assuming a typical fibre delay of around 5 us per kilometre) or a network impairment emulator inserted between
8 O-RU and O-DU. It is necessary to know/measure the transmission delays produced by all fronthaul transport components
9 between O-RU and O-DU in order to properly estimate the total fronthaul latency (T12/T34).

10 5.12.3 Test Procedure

11 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
12 report. The serving cell under test is activated and unloaded. All other cells are turned off.

13 2. The UE (real or emulated) is placed in the excellent radio condition (cell centre) as defined by RSRP in Section 3.6.
14 The UE is powered on and attached to the network.

15 3. The uplink full-buffer UDP and TCP data transmission (see Section 3.4) from UE to the application server should be
16 verified. Excellent radio conditions delivering peak user throughput are identified by stable utilization of the
17 highest possible uplink MCS and uplink transport block size. The utilization of these KPIs should also be verified.

18 4. The minimum fronthaul latency (one-way transmission delay between O-DU and O-RU) should be set using a
19 network impairment emulator or a fibre with corresponding length (assuming typical delay of fibre around 5 us per
20 kilometre).

21 5. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
22 data transmission from the UE to the application server is started. The application server should receive data from the UE.

23 6. All the required performance data (incl. the signalling and control data) as specified in the following “Test
24 requirements” section is measured and captured at UE, eNB/gNB and Application server using logging/measurement
25 tools.

26 7. The capture of log data is stopped. The uplink full-buffer UDP data transmission from the UE to the application server
27 is stopped.

28 8. The fronthaul latency is increased by 20 us if no degradation of user peak uplink throughput was observed in the
29 previous measurement. As soon as a degradation of user peak uplink throughput is observed, the fronthaul
30 latency is increased only by 5 us in order to capture fine-grained log data.

31 9. Steps 5 to 8 are repeated as long as the total degradation of user peak uplink throughput is less than 30%. The KPIs
32 measured with the minimum fronthaul latency are used as a baseline (100%) for calculation of the degradation.

33 10. [Optional] Steps 4 to 9 are repeated for uplink full-buffer TCP data transmission.

34 5.12.4 Test requirements (expected results)

35 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
36 should be recorded and reported for the performance assessment.
37
38 UE side (real or emulated UE):

39 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)


1 • PUSCH BLER, PUSCH MCS (average sample per second)

2 • Transmit power on PUSCH (average sample per second)

3 • Transmitted uplink throughput (Application layer) (average sample per second)

4 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
5 (average sample per second)

6 • GPS coordinates (latitude, longitude) in the field setup

7 eNB/gNB side (if capture of logs is possible):

8 • Radio parameters such as PUSCH SINR (average per second)

9 • PUSCH BLER (average per second)

10 Application server side:

11 • Received uplink throughput (L1 and Application layers) (average sample per second)

12
13 When the UE is in excellent radio condition (cell centre), the stable utilization of the highest possible uplink MCS and
14 uplink transport block size should be observed and evaluated. The eNB/gNB should also receive the data with the
15 minimum uplink BLER.

16 Table 5-14 gives an example of the test results record (median and standard deviation from the captured samples
17 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
18 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G.

19
20 Table 5-14 Example record of test results (median and standard deviation from the captured samples)

For each measured fronthaul latency value
  UDP / TCP
  Total fronthaul transport latency (T12/T34) [us]
  Received L1 UL throughput [Mbps]
  Degradation of Received L1 UL throughput [%]#
  Received Application UL throughput [Mbps]
  UE RSRP [dBm]
  UE RSRQ [dB]
  UE PDSCH SINR [dB]
  PUSCH transmit power [dBm]
  PUSCH MCS
  UL PRB number
  PUSCH BLER [%]
21 # The “Received L1 UL throughput” measured with the minimum fronthaul latency is used as a baseline (100%) for calculation of the degradation.

22 The following figures should also be included in the test report.

23 • Received UDP/TCP uplink throughput (L1 and Application layers) vs Total fronthaul latency (T12/T34)


1 5.13 Impact of midhaul latency on downlink peak throughput

2 5.13.1 Test description and applicability

3 The purpose of the test is to evaluate the user peak downlink throughput as a function of the midhaul transport latency
4 (i.e. the one-way transmission delay between O-DU and O-CU at the F1 interface). Since the midhaul transport latency
5 corresponds with the distance (fiber length) between O-DU and O-CU, the results of the test can also be used to identify the
6 maximum distance (fiber length) with an acceptable degradation of user peak downlink throughput. Note that the midhaul
7 transport exists only if O-DU and O-CU are not a combined entity, and the interface F1 between O-DU and O-CU is
8 exposed. The test does not support the LTE network architecture because an interface providing means for interconnecting
9 LTE O-DU/DU and LTE CU has not yet been specified in either 3GPP or the O-RAN ALLIANCE. Note that the W1
10 interface [33] has been specified only for 5G NSA Options 4 and 7 network architectures. The test is suitable only for 5G
11 NSA (Option 3/3a/3x) and 5G SA network architectures as defined in Chapter 3.

12 5.13.2 Test setup and configuration

13 The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a
14 stationary UE (real or emulated UE) placed in the excellent radio condition as defined in Section 3.6 - SINR should be
15 considered in case of downlink. Within the cell there should be only one active UE downloading data from the application
16 server. The application server should be placed as close as possible to the core/core emulator and connected to the
17 core/core emulator via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is
18 suitable for both lab and field environments.
19 The midhaul latency encompasses only the time from when a bit leaves O-CU until it is received at O-DU.

20
21 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
22 recorded in the test report.

23 Laboratory setup: The radio conditions of UE can be modified by a variable attenuator/fading generator inserted between
24 the antenna connectors of O-RU and UE. The minimum attenuation of radio signal should be set to achieve the excellent
25 radio conditions (SINR as defined in Section 3.6), but the minimum coupling loss (see Section 3.6) should not be
26 exceeded. The UE should be placed inside RF shielded box or RF shielded room if the UE is not connected via cable.

27 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
28 conditions (SINR as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should not
29 be exceeded.

30 In both laboratory and field setups, Figure 5-6, the midhaul transport latency can be modified by using a network
31 impairment emulator inserted between O-DU and O-CU. It is necessary to know/measure the transmission delays
32 produced by all midhaul transport components between O-DU and O-CU in order to properly estimate the total midhaul
33 latency.


[Diagram: two system-under-test configurations with the inserted midhaul (F1) latency between O-RU/O-DU and O-CU –
5G NSA EN-DC (with the 4G leg via S1 and X2 to 4G EPC) and 5G SA (NG to 5G core) – each connected to the 3GPP
core, other services and the application server]
1 Figure 5-6 The midhaul transport latency test setups of 5G NSA and 5G SA

2 5.13.3 Test Procedure

3 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
4 report. The serving cell under test is activated and unloaded. All other cells are turned off.

5 2. The UE (real or emulated) is placed in the excellent radio condition (cell centre) as defined by SINR in Section 3.6.
6 The UE is powered on and attached to the network.

7 3. The downlink full-buffer UDP and TCP data transmission (see Section 3.4) from the application server should be
8 verified. Excellent radio conditions delivering peak user throughput are identified by stable utilization of the
9 highest possible downlink MCS, downlink transport block size and downlink MIMO rank (number of layers). The
10 utilization of these KPIs should also be verified.

11 4. The minimum midhaul latency (one-way transmission delay between O-DU and O-CU) should be set using a network
12 impairment emulator.

13 5. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The downlink full-buffer UDP
14 data transmission from the application server to the UE is started. The UE should receive the data from the application server.

15 6. All the required performance data (incl. the signalling and control data) as specified in the following “Test
16 requirements” section is measured and captured at UE and Application server using logging/measurement tools.

17 7. The capture of log data is stopped. The downlink full-buffer UDP data transmission from the application server is stopped.

18 8. The midhaul latency is increased by 1000 us if no degradation of user peak downlink throughput was observed in the
19 previous measurement. As soon as a degradation of user peak downlink throughput is observed, the midhaul
20 latency is increased only by 250 us in order to capture fine-grained log data.

21 9. Steps 5 to 8 are repeated as long as the total degradation of user peak downlink throughput is less than 30%. The KPIs
22 measured with the minimum midhaul latency are used as a baseline (100%) for calculation of the degradation (a
sketch of this calculation is given after this list).

23 10. [Optional] Steps 4 to 9 are repeated for downlink full-buffer TCP data transmission.


1 5.13.4 Test requirements (expected results)

2 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
3 should be recorded and reported for the performance assessment.
4
5 UE side (real or emulated UE):

6 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

7 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

8 • Received downlink throughput (L1, and Application layers) (average sample per second)

9 • Downlink transmission mode

10 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
11 (average sample per second)

12 • GPS coordinates (latitude, longitude) in the field setup

13 Application server side:

14 • Transmitted downlink throughput (Application layer) (average sample per second)

15 When the UE is in the excellent radio conditions (cell centre), the stable utilization of the highest possible downlink MCS,
16 downlink transport block size and downlink MIMO rank should be observed and evaluated. The UE should also receive
17 the data with the minimum downlink BLER.
18
19 Table 5-15 gives an example of the test results record (median and standard deviation from the captured samples
20 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
21 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G.
22
23 Table 5-15 Example record of test results (median and standard deviation from the captured samples)

For each measured midhaul latency value
  UDP / TCP
  Total midhaul transport latency [us]
  Received L1 DL throughput [Mbps]
  Degradation of Received L1 DL throughput [%]#
  Received Application DL throughput [Mbps]
  UE RSRP [dBm]
  UE RSRQ [dB]
  UE PDSCH SINR [dB]
  MIMO rank
  PDSCH MCS
  DL PRB number
  PDSCH BLER [%]
24 # The “Received L1 DL throughput” measured with the minimum midhaul latency is used as a baseline (100%) for calculation of the degradation.

25 The following figures should also be included in the test report.


1 • Received UDP/TCP downlink throughput (L1 and Application layers) vs Total midhaul latency

2 5.14 Impact of midhaul latency on uplink peak throughput

3 5.14.1 Test description and applicability

4 The purpose of the test is to evaluate the user peak uplink throughput as a function of the midhaul transport latency (i.e.
5 the one-way transmission delay between O-DU and O-CU at the F1 interface). Since the midhaul transport latency corresponds
6 with the distance (fiber length) between O-DU and O-CU, the results of the test can also be used to identify the maximum
7 distance (fiber length) with an acceptable degradation of user peak uplink throughput. Note that the midhaul transport
8 exists only if O-DU and O-CU are not a combined entity, and the interfaces between O-DU and O-CU are exposed. The
9 test does not support the LTE network architecture because an interface providing means for interconnecting LTE O-DU/DU
10 and LTE CU has not yet been specified in either 3GPP or the O-RAN ALLIANCE. Note that the W1 interface [33] has been
11 specified only for 5G NSA Options 4 and 7 network architectures. The test is suitable only for 5G NSA (Option 3/3a/3x)
12 and 5G SA network architectures as defined in Chapter 3.

13 5.14.2 Test setup and configuration

14 The network setup is a single cell scenario (i.e. an isolated cell without any inter-cell interference – see Section 3.7) with a
15 stationary UE (real or emulated UE) placed in the excellent radio condition as defined in Section 3.6 - RSRP should be
16 considered in case of uplink. Within the cell there should be only one active UE uploading data to the application server. The
17 application server should be placed as close as possible to the core/core emulator and connected to the core/core emulator
18 via a transport link with sufficient capacity so as not to limit the expected data throughput. The test is suitable for both lab
19 and field environments.
20 The midhaul latency encompasses only the time from when a bit leaves O-DU until it is received at O-CU.

21
22 Test configuration: The test configuration is not specified. The utilized test configuration (parameters) should be
23 recorded in the test report.

24 Laboratory setup: The radio conditions of UE can be modified by a variable attenuator/fading generator inserted between
25 the antenna connectors of O-RU and UE. The minimum attenuation of radio signal should be set to achieve the excellent
26 radio conditions (RSRP as defined in Section 3.6), but the minimum coupling loss (see Section 3.6) should not be
27 exceeded. The UE should be placed inside RF shielded box or RF shielded room if the UE is not connected via cable.

28 Field setup: The UE is placed in the centre of cell close to the radiated eNB/gNB antenna(s), where excellent radio
29 conditions (RSRP as defined in Section 3.6) should be observed. The minimum coupling loss (see Section 3.6) should
30 not be exceeded.

31 In both laboratory and field setups, Figure 5-6, the midhaul transport latency can be modified by using a network
32 impairment emulator inserted between O-DU and O-CU. It is necessary to know/measure the transmission delays
33 produced by all midhaul transport components between O-DU and O-CU in order to properly estimate the total midhaul
34 latency.

35 5.14.3 Test Procedure

36 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
37 report. The serving cell under test is activated and unloaded. All other cells are turned off.

38 2. The UE (real or emulated) is placed in the excellent radio condition (cell centre) as defined by RSRP in Section 3.6.
39 The UE is powered on and attached to the network.


1 3. The uplink full-buffer UDP and TCP data transmission (see Section 3.4) from UE to the application server should be
2 verified. Excellent radio conditions delivering peak user throughput are identified by stable utilization of the
3 highest possible uplink MCS and uplink transport block size. The utilization of these KPIs should also be verified.

4 4. The minimum midhaul latency (one-way transmission delay between O-DU and O-CU) should be set using a network
5 impairment emulator.

6 5. The UE should be turned off or set to airplane mode, if possible, to empty the buffers. The uplink full-buffer UDP
7 data transmission from the UE to the application server is started. The application server should receive the data from the UE.

8 6. All the required performance data (incl. the signalling and control data) as specified in the following “Test
9 requirements” section is measured and captured at UE, eNB/gNB and Application server using logging/measurement
10 tools.

11 7. The capture of log data is stopped. The uplink full-buffer UDP data transmission from the UE to application server
12 is stopped.

13 8. The midhaul latency is increased by 1000 us if no degradation of user peak uplink throughput was observed in the
14 previous measurement. As soon as a degradation of user peak uplink throughput is observed, the midhaul latency
15 is increased only by 250 us in order to capture fine-grained log data.

16 9. Steps 5 to 8 are repeated as long as the total degradation of user peak uplink throughput is less than 30%. The KPIs
17 measured with the minimum midhaul latency are used as a baseline (100%) for calculation of the degradation.

18 10. [Optional] Steps 4 to 9 are repeated for uplink full-buffer TCP data transmission.

19 5.14.4 Test requirements (expected results)

20 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
21 should be recorded and reported for the performance assessment.
22
23 UE side (real or emulated UE):

24 • Radio parameters such as RSRP, RSRQ, CQI, PDSCH SINR (average sample per second)

25 • PUSCH BLER, PUSCH MCS (average sample per second)

26 • Transmit power on PUSCH (average sample per second)

27 • Transmitted uplink throughput (Application layer) (average sample per second)

28 • Channel utilization, i.e. Number of allocated/occupied uplink PRBs and Number of allocated/occupied slots
29 (average sample per second)

30 • GPS coordinates (latitude, longitude) in the field setup

31 eNB/gNB side (if capture of logs is possible):

32 • Radio parameters such as PUSCH SINR (average per second)

33 • PUSCH BLER (average per second)

34 Application server side:

35 • Received uplink throughput (L1 and Application layers) (average sample per second)


1
2 When the UE is in excellent radio condition (cell centre), the stable utilization of the highest possible uplink MCS and
3 uplink transport block size should be observed and evaluated. The eNB/gNB should also receive the data with the
4 minimum uplink BLER.

5 Table 5-16 gives an example of the test results record (median and standard deviation from the captured samples
6 should be calculated for each metric). In case of 5G SA and NSA, SS-RSRP and SS-SINR should be reported. In case of
7 5G NSA and dual connectivity (EN-DC), the values should be provided separately for both LTE and 5G.
8
9 Table 5-16 Example record of test results (median and standard deviation from the captured samples)

For each measured midhaul latency value
  UDP / TCP
  Total midhaul transport latency [us]
  Received L1 UL throughput [Mbps]
  Degradation of Received L1 UL throughput [%]#
  Received Application UL throughput [Mbps]
  UE RSRP [dBm]
  UE RSRQ [dB]
  UE PDSCH SINR [dB]
  PUSCH transmit power [dBm]
  PUSCH MCS
  UL PRB number
  PUSCH BLER [%]
10 # The “Received L1 UL throughput” measured with the minimum midhaul latency is used as a baseline (100%) for calculation of the degradation.

11 The following figures should also be included in the test report.

12 • Received UDP/TCP uplink throughput (L1 and Application layers) vs Total midhaul latency

13


1 Services
2 As a part of O-RAN testing, we need to ensure that the O-RAN system can work in a telecom network by inter-working with
3 the other sub-systems to provide end-to-end services. This section of the document outlines the services that need to
4 be tested to validate that the O-RAN system can be deployed and optimized to deliver a great end user service experience
5 in a telecom network.

6 The services that need to be tested are broadly classified as data service, video streaming service, voice service and video
7 calling service in the eMBB slice, and services supported using other slices like URLLC and mMTC. This section of the
8 document also covers different scenarios, such as handover and varying radio conditions, to assess the impact of these scenarios
9 on the end user service experience.

10 6.1 Data Services


11 Data services form one of the basic services supported today on a telecom network. This includes everything
12 from web browsing and uploading/downloading content to the traffic generated by all the different applications on the end user
13 device. These services form an integral part of data traffic on a telecom network. This section comprises two test
14 scenarios: web browsing and file upload/download.

15 Along with the monitoring and validation of these services using user experience KPIs, the O-RAN system also needs
16 to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN
17 system. Some of these features are listed below. As a part of the data services testing, details of these features
18 need to be included in the test report to give a comprehensive view of the setup used for testing. If additional features or
19 functionalities that impact the end user experience have been enabled during this testing, those need to be included in the
20 test report as well.

21 • Control Channel Beamforming


22 • Connected Mode DRX
23 • Massive MIMO
24 • Coordinated Multi Point (DL and UL)
25 • 256QAM Support (DL and UL)
26 • SSB Power Boost
27 • Beam Management
28 • DL Common Channel Beamforming
29 • Single-User MIMO Beamforming (DL and UL)
30 • Multi-User MIMO Beamforming (DL and UL)
31 • Single-User MIMO, TM Switching
32 • Multi-User MIMO, TM Switching
33 • TDD configuration support
34 • RAN Slicing Framework (NR Low/Mid/High)
35 • RACH Enhancements (PRACH Format 0)
36 • PF (Proportional Fairness) Scheduling
37 • QoS Scheduling
38 • Minimal Bit Rate Scheduling


1 • Link Adaptation (DL and UL)


2 • Uplink Data Compression (UDC)
3 • PUSCH Frequency Hopping
4 The table below summarizes the test cases and the applicable technologies.

5 Table 6-1: Data Service Test Case summary

Test ID   Test case              Functional group   Applicable technology
                                                    LTE    NSA    SA
6.1.1     Web Browsing           Data service       Y      Y      Y
6.1.2     File upload/download   Data service       Y      Y      Y

8 6.1.1 Web Browsing

9 6.1.1.1 Test Description

10 Web browsing forms an integral part of the 4G and 5G data network, and this test case is applicable to both NSA and
11 SA deployments. The test needs to be performed twice, once for the NSA deployment and once for the SA deployment,
12 as applicable. HTTP is the protocol used for web browsing, and this protocol can be transported over TCP/TLS or
13 UDP/QUIC; the scenarios below cover both options. Testing can be performed using HTTP/TLS/TCP, HTTP/QUIC/UDP,
14 or both protocols, as applicable.

15 Web browsing KPI

16 • DNS Resolution time – Time taken from when the client sends a DNS query to when the DNS server responds
17 with an IP address, in milliseconds/seconds. This KPI should be recorded if DNS is used.
18 • Time To First Byte (TTFB) – Time taken from when the client makes the HTTP request to when the first
19 byte of the page is received, in milliseconds/seconds.
20 • Page Load Time – Time taken from when the client places the request to when the page is completely
21 loaded, in seconds. A sketch of measuring these KPIs is given after this list.

22 6.1.1.2 Test Setup

23 The SUT in this test case would be O-eNB along with O-RU, O-DU and O-CU-CP/O-CU-UP for NSA deployments or
24 just the O-RU, O-DU and O-CU-CP/O-CU-UP for SA deployments. The O-RAN setup should support the ability to
25 perform this testing in different radio conditions as defined in Section 3.6. The 4G/5G core will be required to support
26 the basic functionality to authenticate and register an end user device in order to set up a PDN connection/PDU session.
27 The 4G core will be used for O-RAN NSA testing, whereas 5G core will be used for O-RAN SA testing. The 4G/5G
28 core could be a completely emulated, partially emulated or a real non-emulated core. The application server should
29 support web browsing and be accessible from the 4G/5G core. This application server(s) should support protocols like
30 HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an
31 emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet
32 captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated
33 network elements and end user device or an external tool. Optionally, if some of the network elements or application


1 server is located remotely either in a cloud or on the internet, the additional latency should be calculated and accounted
2 for.

3 The O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. Tools
4 which emulate latency need to be used and configured on the O-RAN system (between O-RU, O-DU and O-CU-CP/O-
5 CU-UP) to emulate real-world deployment conditions as applicable. The end user device must be configured with the
6 right user credentials to be able to register and authenticate with the O-RAN system and the 4G/5G core. The end user
7 device also needs to be provisioned with the right application like the web browser to perform the tests. The 4G/5G core
8 network must be configured to support end user device used for testing. This includes supporting registration,
9 authentication and PDN connection/PDU session establishment for this end user device. The application server should
10 be configured to support web browsing, along with support for HTTP/TCP and/or HTTP/QUIC protocols as applicable.
11 Web browsing has a lot of variables which can impact the KPIs, and in order to reduce the variables and make the test
12 outcomes consistent, it is recommended to set up a static web page of ~1.8-2MB size. The size of the web page is
13 comparable to the average size of web pages on the internet. A static web page does not change the content on repeated
14 requests, thus making the test results consistent and repeatable. The locations where the radio conditions are excellent,
15 good, fair and poor need to be identified within the serving cell.

16 All the elements in the network like O-RAN system, 4G/5G core and the application server need to have the ability to
17 capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to
18 capture traces/packet capture to calculate the web browsing KPIs. Optionally, the network could have network taps
19 deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally,
20 all these different components need to have connectivity with each other – the end user device should be able to connect
21 to O-RAN system, which needs to be connected to the 4G/5G core which in turn should have connectivity to the
22 application server.

23 6.1.1.3 Test Methodology/Procedure

24 Ensure the end user device, O-RAN system, 4G/5G core and the application server have all been configured as outlined
25 in Section 6.1.1.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all
26 communication between network elements can be captured and validated.

27 1. Power on the end user device in excellent radio condition and ensure it registers with the 4G/5G core for data
28 services.
29 2. Once the registration is complete, the end user device has to establish a PDU session with the 5G core (SA) or PDN
30 connection with the 4G core (NSA).
31 3. Open the browser or application on the end user device and start a web browsing session to a web content using
32 HTTP/TCP and/or HTTP/QUIC protocol. If the intent is to execute this test case for both protocols, then this step
33 needs to be performed twice, once for web content using HTTP/TCP followed by web content using HTTP/QUIC
34 protocol.
35 4. Validate the end user can get the content from the application server and view it on the end device. For every test
36 collect the KPIs included in Section 6.1.1.1.
37 5. Clear the browser cache on the end user device between tests.
38 6. Repeat the test multiple times (>10 times) and gather results.
39 7. Repeat the above steps 1 through 6 for the good, fair and poor radio conditions.
40

41 6.1.1.4 Test Expectation (expected results)

42 As a pre-validation, use the traces to validate a successful registration and PDN connection/PDU session setup by the
43 end user device without any errors. This is a prerequisite before these tests can be validated.


1 First and foremost, the end user device should be able to display the web page for the web browsing test, with
2 the content being viewable and readable. Use the packet captures to validate that there are no packet drops or out-of-sequence
3 packets which could impact customer experience and point to a misconfigured or flawed system. Any failures
4 encountered during testing will have to be investigated to identify the root cause.

5 Calculate the web browsing KPIs included in Section 6.1.1.1 and include them in the test report. In a lab setup, the use of
6 DNS could be bypassed by using IP addresses instead of domain names. However, the DNS resolution time KPI must
7 be recorded if a DNS is used for testing. The average range of values for these KPIs is included below for guidance,
8 taking into consideration the size of the web page (~ 2MB). These values are applicable when the testing is performed
9 in a controlled environment in good radio conditions without the interference of external factors which could impact the
10 KPIs; for example, use of the internet to connect to remote servers/hosts could add additional latency and packet loss
11 issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this
12 scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the
13 KPI range encountered during testing will have to be investigated to identify the root cause, as the issues may be due to
14 variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be
15 repeated.

16 • DNS Resolution time (conditional mandatory) – < 1 second


17 • Time To First Byte (TTFB) – < 3 seconds
18 • Page Load Time – < 12 seconds
19

20 As a part of gathering test data and reporting, ensure the minimum configuration parameters (see Section 3.3) are
21 included in the test report. The following information should also be included in the test report to provide a
22 comprehensive view of the test setup.

23 End user device side (real or emulated):

24 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

25 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

26 • Received downlink throughput for the duration of the page download (L1 and L3 PDCP layers) (average sample
27 per second)

28 • Downlink transmission mode

29 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
30 (average sample per second)

31 The table below gives an example of the test report considering the mean and standard deviation of all test results that
32 have been captured.

33 Table 6-2 Example of Test Report for Web Browsing Testing

                                             Excellent        Good             Fair             Poor
                                             (cell centre)                                      (cell edge)
                                             HTTP over        HTTP over        HTTP over        HTTP over
                                             TCP/QUIC         TCP/QUIC         TCP/QUIC         TCP/QUIC
DNS Resolution Time (conditional mandatory)
Time To First Byte (TTFB)
Page Load Time
L1 DL throughput [Mbps]
L1 DL Spectral efficiency [bps/Hz]
L3 DL PDCP throughput [Mbps]
Application DL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
The web browsing experience can also be impacted by some of the features available (see Section 6.1) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's web browsing experience, should be included in the test report to provide a comprehensive view of the testing.

6.1.2 File upload/download

6.1.2.1 Test Description

File Transfer Protocol (FTP) is a simple application layer protocol used to transfer files between remote locations. FTP, along with the different flavours of the protocol, forms one of the fundamental methods to upload/download a file on the internet. This test case is applicable to both NSA and SA deployments and will need to be performed twice, once for NSA and repeated again for SA deployment as applicable. This scenario tests the end user experience when uploading/downloading files to/from an FTP server over an O-RAN system. The KPIs used to measure the user experience are included below.

1. Download throughput – The average application layer throughput to download the file, in kbps.
2. Upload throughput – The average application layer throughput to upload the file, in kbps.
3. Time taken to Download file – The time taken to download the file, in seconds.
4. Time taken to Upload file – The time taken to upload the file, in seconds.

6.1.2.2 Test Setup

The SUT in this test case would be the O-eNB along with the O-RU, O-DU and O-CU-CP/O-CU-UP for NSA deployments, or just the O-RU, O-DU and O-CU-CP/O-CU-UP for SA deployments. The O-RAN setup should support the ability to perform this testing in different radio conditions as defined in Section 3.6. A 4G/5G core will be required with the basic functionality to authenticate and register an end user device in order to set up a PDN connection/PDU session. The 4G core will be used for O-RAN NSA testing, whereas the 5G core will be used for O-RAN SA testing. The 4G/5G core could be a completely emulated, partially emulated or a real non-emulated core. An FTP server acts as the application server for this test case. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool.


Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. Tools which emulate latency need to be used and configured on the O-RAN system (between O-RU, O-DU and O-CU-CP/O-CU-UP) to emulate real-world deployment conditions. The end user device must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 4G/5G core. The end user device also needs to be provisioned with the right application, like an FTP client, to upload and download files. The 4G/5G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN connection/PDU session establishment for this end user device. The FTP server should host large files (>1 GB) which can be used for the download/upload testing. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.
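One possible way to emulate transport latency between the O-RAN components, when they are connected over Linux-based transport nodes, is the kernel's netem queueing discipline; the interface name and delay values in the sketch below are placeholders, and the specification does not mandate any particular tool:

```python
# Sketch: add/remove emulated one-way delay on a transport interface using
# tc/netem. Interface name and delay/jitter values are placeholders.
import subprocess

def add_delay(interface, delay_ms, jitter_ms=0):
    subprocess.run(["tc", "qdisc", "add", "dev", interface, "root", "netem",
                    "delay", f"{delay_ms}ms", f"{jitter_ms}ms"], check=True)

def remove_delay(interface):
    subprocess.run(["tc", "qdisc", "del", "dev", interface, "root"],
                   check=True)

add_delay("eth1", 5, 1)   # e.g. 5 ms ± 1 ms on the midhaul interface
```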

All the elements in the network, like the O-RAN system, 4G/5G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the file upload/download KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system, which needs to be connected to the 4G/5G core, which in turn should have connectivity to the application server.
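Before starting the test runs, this end-to-end connectivity chain can be sanity-checked with a simple TCP reachability probe; the hosts and ports below are placeholders for the lab's management and application endpoints (UDP-based interfaces need a different check):

```python
# Sketch: pre-flight TCP reachability check of the lab components.
# All hostnames and ports are hypothetical placeholders.
import socket

ENDPOINTS = [("o-cu.lab.example", 22),    # O-CU management access
             ("core.lab.example", 22),    # 4G/5G core management access
             ("ftp.lab.example", 21)]     # FTP application server

for host, port in ENDPOINTS:
    try:
        with socket.create_connection((host, port), timeout=3):
            print(f"{host}:{port} reachable")
    except OSError as exc:
        print(f"{host}:{port} NOT reachable: {exc}")
```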

6.1.2.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G/5G core and the FTP server have all been configured as outlined in Section 6.1.2.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the end user device in excellent radio condition and ensure it registers with the 4G/5G core for data services.
2. Once the registration is complete, the end user device has to establish a PDU session with the 5G core (SA) or a PDN connection with the 4G core (NSA).
3. Open the application on the end user device and upload the test file to the FTP server. Make a note of the time taken to upload the file and the average upload throughput.
4. Next, use the same application on the end user device to download a different test file from the FTP server. Make a note of the time taken to download the file and the average download throughput.
5. Clear all the buffers and caches on the client and the server. Delete the test files – the downloaded file on the client and the uploaded file on the FTP server.
6. Repeat the test multiple times (>10 times) and gather the results. A scripted example of steps 3 through 6 is sketched below.
7. Repeat the above steps 1 through 6 for the good, fair and poor radio conditions.
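A minimal sketch of steps 3 through 6 using the Python standard library ftplib is shown below; the server address, credentials and file names are placeholders, and any FTP client that reports transfer times may be used instead:

```python
# Sketch: time one FTP upload and one download and derive the throughput
# KPIs of Section 6.1.2.1. Server, credentials and file names are
# placeholders for this example.
import time
from ftplib import FTP

def timed_upload(ftp, local_path, remote_name):
    start = time.monotonic()
    with open(local_path, "rb") as f:
        ftp.storbinary(f"STOR {remote_name}", f)
    return time.monotonic() - start

def timed_download(ftp, remote_name, local_path):
    start = time.monotonic()
    with open(local_path, "wb") as f:
        ftp.retrbinary(f"RETR {remote_name}", f.write)
    return time.monotonic() - start

ftp = FTP("ftp.lab.example")              # hypothetical lab FTP server
ftp.login("testuser", "testpass")
up_s = timed_upload(ftp, "ul_test.bin", "ul_test.bin")
dl_s = timed_download(ftp, "dl_test.bin", "dl_test.bin")
size_kbit = 1.0 * 1000 * 1000 * 8         # e.g. a 1 GB test file, in kbit
print(f"upload:   {up_s:.1f} s, {size_kbit / up_s:.0f} kbps")
print(f"download: {dl_s:.1f} s, {size_kbit / dl_s:.0f} kbps")
ftp.quit()
```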

6.1.2.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDN connection/PDU session setup by the end user device without any errors. This is a prerequisite before these tests can be validated.

Validate the end user device can upload/download the complete file to/from the FTP server without interruption. Validate there isn't a drop in throughput while uploading/downloading the file to/from the FTP server. Use the packet captures to validate there are no dropped or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.


Calculate the file upload/download KPIs included in Section 6.1.2.1 and include them in the test report. There are no target values for these KPIs as they are dependent on multiple factors in the test configuration. Some comments on each of the KPIs are included below.

• Download throughput – This value should be comparable to the results of the downlink throughput tests performed in Section 5.2 and Section 5.4.
• Upload throughput – This value should be comparable to the results of the uplink throughput tests performed in Section 5.3 and Section 5.5.
• Time taken to Download file – This value should be comparable to the value calculated using the formula included below.
• Time taken to Upload file – This value should be comparable to the value calculated using the formula included below.

time taken to upload/download [seconds] = (size of file in GB × 1000 × 1000 × 8) / (uplink/downlink throughput [kbps])
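For convenience, the formula can be wrapped in a small helper to compare a measured transfer time against the value expected from the measured throughput:

```python
# Sketch: expected transfer time from file size and throughput.
def expected_transfer_s(size_gb, throughput_kbps):
    return size_gb * 1000 * 1000 * 8 / throughput_kbps

# e.g. a 1 GB file at a measured 150,000 kbps (150 Mbps) -> ~53.3 seconds
print(f"{expected_transfer_s(1.0, 150_000):.1f} s expected")
```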

The KPI values defined in Section 6.1.2.1 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report considering the mean and standard deviation of all test results that have been captured.

Table 6-3 Example Test Report for File Upload/Download Testing

                                              Excellent              Good                   Fair                   Poor
                                              (cell centre)                                                        (cell edge)
                                              File Upload/Download   File Upload/Download   File Upload/Download   File Upload/Download
Upload/Download Throughput (kbps)
Time taken to Upload/Download File
L1 DL throughput [Mbps]
L1 DL Spectral efficiency [bps/Hz]
L3 DL PDCP throughput [Mbps]
Application DL throughput [Mbps]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
The file upload/download experience can also be impacted by some of the features available (see Section 6.1) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's file upload/download experience, should be included in the test report to provide a comprehensive view of the testing.

6.2 Video Streaming


Video makes up a major part of internet traffic today, and we are seeing a similar trend in mobile data traffic. Mobile video accounts for close to two-thirds of the total mobile data traffic and is expected to increase in the coming years. Video streaming has evolved over time with newer and better audio and video codecs. The protocols used to stream the video packets have also been evolving over time, with Adaptive Bit Rate (ABR) being the most common approach used for streaming video on the internet today. There are multiple flavours of ABR, but at their core they all use the HTTP protocol over TCP/TLS or UDP/QUIC to transfer audio-video packets to a client. A video streaming server which supports ABR hosts multiple versions of the same video content, with each version of the video being encoded at a different resolution and quality, hence different bit rates. The client uses the ABR protocol to get the list of all available versions of the requested video content and picks the best video content based on the available bandwidth on the client side. The ABR client continuously monitors the network condition and dynamically adjusts the quality and resolution of the video stream to match the available bandwidth.

The ABR protocol provides the user with the best video streaming experience based on the available bandwidth between the client and the content server. This, however, adds a number of variables to the streaming session:

• An ABR client may start with a lower resolution video and progressively switch to higher resolution video as it better estimates the bandwidth available at the client.
• An ABR client might notice an improvement in the bandwidth and request higher resolution/quality video content, thus improving the video quality mid-stream.
• An ABR client might notice a degradation in the bandwidth and request lower resolution/quality video content, thus deteriorating the video quality mid-stream.
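The core of this behaviour can be illustrated with a few lines of pseudo-client logic; the bitrate ladder and safety margin below are illustrative assumptions, not taken from any specific ABR implementation:

```python
# Sketch: an ABR client's rendition choice. The ladder and margin are
# illustrative assumptions, not part of any standardised algorithm.
RENDITIONS_KBPS = [145, 800, 1600, 3200, 6500]   # e.g. 240p ... 1080p ladder

def pick_rendition(estimated_bandwidth_kbps, margin=0.8):
    """Pick the highest rendition fitting within a conservative fraction
    of the estimated bandwidth; fall back to the lowest rendition."""
    budget = estimated_bandwidth_kbps * margin
    fitting = [r for r in RENDITIONS_KBPS if r <= budget]
    return max(fitting) if fitting else RENDITIONS_KBPS[0]

print(pick_rendition(4500))   # -> 3200: quality improves as estimate grows
print(pick_rendition(1100))   # -> 800:  quality degrades mid-stream
```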

These variables make it challenging to quantify the video streaming experience of an end user. Many tools and organizations have defined different methods and algorithms to quantify the end user experience, including QoE (Quality of Experience), SVQ (Streaming Video Quality), Video Multimethod Assessment Fusion (VMAF), etc. In this document we recommend using the Mean Opinion Score (MOS) as defined by ITU-T P.1203.3 to quantify the quality of experience of the end user. This mechanism, however, limits the testing to the H.264 video codec at HD quality (1080p resolution – 1920 x 1080 pixels) or below.

Video Streaming KPIs

• Video start time or Time to load first video frame – Time from when the video was selected to play to when the video starts playing, in seconds.
• Number of video stalls/buffering (Optional) – Number of times the video stalled or started buffering during the course of video streaming. This KPI has already been considered by ITU-T P.1203.3 to provide a cumulative MOS score.
• Duration of stalls in the video (Optional) – The cumulative duration of all the stalls during the course of video streaming, in seconds. This KPI has already been considered by ITU-T P.1203.3 to provide a cumulative MOS score.
• Video MOS score – MOS score for the video streaming session as defined by ITU-T P.1203.3.
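The first three KPIs can be derived mechanically from a timestamped player event log, as in the sketch below; the event log format is an assumption of this example, and the resulting stall list corresponds to the stalling input that an ITU-T P.1203 implementation consumes when computing the cumulative MOS:

```python
# Sketch: derive video start time and stall KPIs from a hypothetical
# chronological player event log of (timestamp_s, event_name) tuples.
def video_kpis(events):
    times = {name: ts for ts, name in events
             if name in ("selected", "first_frame")}
    stalls, stall_start = [], None
    for ts, name in events:
        if name == "stall_start":
            stall_start = ts
        elif name == "stall_end" and stall_start is not None:
            stalls.append(ts - stall_start)
            stall_start = None
    return {"video_start_s": times["first_frame"] - times["selected"],
            "stall_count": len(stalls),
            "stall_total_s": sum(stalls)}

log = [(0.0, "selected"), (1.3, "first_frame"),
       (42.0, "stall_start"), (44.5, "stall_end")]   # illustrative log
print(video_kpis(log))
# {'video_start_s': 1.3, 'stall_count': 1, 'stall_total_s': 2.5}
```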

Along with the monitoring and validation of these services using user experience KPIs, the O-RAN systems also need to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN system. Some of these features are listed below. As a part of the video streaming testing, details of these features need to be included in the test report to give a comprehensive view of the setup used for testing. If additional features or functionalities that impact the end user experience have been enabled during this testing, those need to be included in the test report as well.

• NR to LTE PS Redirection/Cell Reselection/Handover
• Intra-frequency / Inter-frequency Cell Reselection/Handover
• NR Coverage-Triggered NR Session Continuity
• LTE-NR & NR-NR Dual Connectivity and NR Carrier Aggregation
• Direct data forwarding between NG-RAN and E-UTRAN nodes for inter-system mobility
• Standard QCI Bearers Support
• Control Channel Beamforming
• Massive MIMO
• 256QAM Support (DL and UL)
• Beam Management
• TDD configuration support
• PF (Proportional Fairness) Scheduling
• Link Adaptation (DL and UL)
• Application Aware QoS

Please see below a summary table of the test cases and the applicable technologies.

Table 6-4: Video Streaming Service Test Case Selection summary

Test ID   Test case                                                                 Functional group   Applicable technology
                                                                                                       LTE    NSA    SA
6.2.1     Video Streaming – Stationary Test                                         Service            Y      Y      Y
6.2.2     Video Streaming – Handover between same Master eNB but
          different O-RUs – Intra O-DU                                              Service            N/A    Y      N/A
6.2.3     Video Streaming – Handover between same MeNB but
          different O-DUs – Inter O-DU Intra O-CU                                   Service            N/A    Y      N/A
6.2.4     Video Streaming – Handover between same MeNB but
          different O-CUs – Inter O-CU                                              Service            N/A    Y      N/A
6.2.5     Video Streaming – Handover between different MeNB while
          staying connected to same SgNB                                            Service            N/A    Y      N/A

6.2.1 Video Streaming – Stationary Test

This scenario tests the video experience of a user streaming video over a 4G or 5G network when the end user device is stationary.

6.2.1.1 Test Description

The majority of video streaming on the internet today uses ABR (Adaptive Bit Rate) streaming using the HTTP protocol over TCP/TLS or UDP/QUIC. As a part of this effort, we need to test the user's video streaming experience when connected to a telecom network over an O-RAN system. This test case is applicable to both NSA and SA deployments, and will need to be performed twice, once for NSA and repeated again for SA deployment as applicable.

6.2.1.2 Test Setup

The SUT in this test case would be the O-eNB along with the O-RU, O-DU and O-CU-CP/O-CU-UP for NSA deployments, or just the O-RU, O-DU and O-CU-CP/O-CU-UP for SA deployments. The O-RAN setup should support the ability to perform this testing in different radio conditions as defined in Section 3.6. A 4G/5G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection/PDU session. The 4G core will be used for O-RAN NSA testing, whereas the 5G core will be used for O-RAN SA testing. The 4G/5G core could be a completely emulated, partially emulated or a real non-emulated core. The application server(s) for this testing should support video streaming using the ABR protocol over HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements or the application server are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. Tools which emulate latency need to be used and configured on the O-RAN system (between O-RU, O-DU and O-CU-CP/O-CU-UP) to emulate real-world deployment conditions. The end user device must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 4G/5G core. The end user device also needs to be provisioned with the right application, like a video streaming client, to perform the tests. The 4G/5G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN connection/PDU session establishment for this end user device. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.

All the elements in the network, like the O-RAN system, 4G/5G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the video streaming KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system, the O-RAN system needs to be connected to the 4G/5G core, which in turn should have connectivity to the application server.


6.2.1.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G/5G core and the application server have all been configured as outlined in Section 6.2.1.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the end user device in excellent radio condition and ensure it registers with the 4G/5G core for data services.
2. Once the registration is complete, the end user device has to establish a PDN connection with the 4G core (NSA) or a PDU session with the 5G core (SA).
3. Open the video streaming client on the end user device, start a video streaming session over the HTTP/TCP protocol and let the video stream for at least 120 seconds.
4. Optionally, repeat the test by streaming a video session over the HTTP/QUIC protocol and stream the video content for at least 120 seconds.
5. Repeat the test multiple times (>10 times) and gather the results.
6. Repeat the above steps 1 through 5 for the good, fair and poor radio conditions. A sketch automating this loop is shown below.
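The repetition loop across protocols and radio conditions is a natural candidate for automation; in the sketch below, measure_streaming_session() is a hypothetical, tool-specific hook (stubbed here) standing in for whatever runs one 120-second session and returns the Section 6.2 KPIs:

```python
# Sketch: automate the repetition loop of the procedure above.
def measure_streaming_session(protocol, min_duration_s):
    """Hypothetical hook: run one streaming session with the chosen tool
    and return the Section 6.2 KPI dict. Stubbed here for illustration."""
    return {"video_start_s": 1.3, "stall_count": 0,
            "stall_total_s": 0.0, "mos": 4.1}

def run_condition(condition, trials=12, protocols=("HTTP/TCP", "HTTP/QUIC")):
    results = []
    for proto in protocols:               # the HTTP/QUIC pass is optional
        for run in range(trials):         # spec asks for >10 repetitions
            kpis = measure_streaming_session(protocol=proto,
                                             min_duration_s=120)
            kpis.update(condition=condition, protocol=proto, run=run)
            results.append(kpis)
    return results

all_results = []
for condition in ("excellent", "good", "fair", "poor"):
    input(f"Move the UE to the {condition} radio location, press Enter...")
    all_results.extend(run_condition(condition))
```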

6.2.1.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDN connection/PDU session setup by the end user device without any errors. This is a prerequisite before these tests can be validated.

Validate the end user device is able to start streaming the video without delays and watch the video content. Ensure the video is able to stream without stalls or intermittent buffering. Use the packet captures to validate there are no dropped or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI ranges will have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

• Video Start time or Time to load first video frame – ~1.5 seconds
• Number of video stalls/buffering – < 1
• Duration of stalls in the video – < 5 seconds
• Video MOS Score – > 3.5

The values for the end user video streaming KPIs defined in Section 6.2 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report for the testing performed in different radio conditions to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report considering the mean and standard deviation of all test results that have been captured.

Table 6-5 Example Test Report for Video Streaming – Stationary Test

                                      Excellent              Good                   Fair                   Poor
                                      (cell centre)                                                        (cell edge)
                                      Video Streaming        Video Streaming        Video Streaming        Video Streaming
                                      over TCP/QUIC          over TCP/QUIC          over TCP/QUIC          over TCP/QUIC
Video Start Time
Number of Video Stalls/buffering
Duration of stalls in the video
Video MOS Score
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
The end user video streaming experience can also be impacted by some of the features available (see Section 6.2) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's video streaming experience, should be included in the test report to provide a comprehensive view of the testing.

6.2.2 Video Streaming – Handover between same Master eNB but different O-RUs – Intra O-DU

This test scenario validates the user's video streaming experience when the end user device (UE) is connected over NSA to a 4G core and is in the process of a handover between two O-RAN subcomponents (two O-RUs) on the Secondary gNB while remaining connected to the same Master eNB. As services like VoLTE or VoNR cannot be used to test handover of the secondary gNB, video streaming is used for this testing. Hence, this test scenario is only applicable in an NSA deployment.

6.2.2.1 Test Description

The 5G O-RAN system has multiple sub-components like the O-RU, O-DU and the O-CU-CP/O-CU-UP. This setup leads to multiple handover scenarios when the O-RAN system is connected as a Secondary gNB in an NSA deployment. This scenario tests the end user's video streaming experience when the end user device is connected over NSA to the 4G core and performs a handover between O-RUs which are connected to the same O-DU (and O-CU-CP/O-CU-UP) on the Secondary gNB – an intra-O-DU handover. The end user device remains connected to the same Master eNB through the entire handover process. This handover is agnostic to the 4G core as the handover occurs on the O-RAN system. This test assesses the impact of this handover scenario on the end user device's video streaming service by monitoring the end user video streaming KPIs included in Section 6.2. This test case streams video using the ABR protocol over HTTP/TCP and/or HTTP/QUIC.

6.2.2.2 Test Setup

The SUT in this test case would be the Master eNB and a Secondary gNB. The Master eNB will be an O-eNB and the Secondary gNB will consist of a pair of O-RUs – O-RU1 and O-RU2 – which will connect to the same O-DU and O-CU-CP/O-CU-UP. The O-eNB, O-RUs, O-DU and O-CU-CP/O-CU-UP have to comply with the O-RAN specifications. A 4G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. The 4G core could be a completely emulated, partially emulated or a real non-emulated core. The application server(s) for this testing should support video streaming using the ABR protocol over HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements or application server(s) are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-eNB needs to have the right configuration and software load. The pair of O-RUs (O-RU1 and O-RU2) need to be connected to the O-DU and O-CU-CP/O-CU-UP, and all the components need to have the right configuration and software load. Tools which emulate latency need to be used and configured on the O-RAN system (between O-RU, O-DU and O-CU-CP/O-CU-UP) to emulate real-world deployment conditions. The end user device must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 4G core. The end user device also needs to be provisioned with the right application, like a video streaming client, to perform the tests. The 4G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN connection establishment for this end user device.

All the elements in the network, like the O-RAN system, 4G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the video streaming KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNB, O-RUs, O-DU and O-CU-CP/O-CU-UP), the O-RAN system needs to be connected to the 4G core, which in turn should have connectivity to the application server.

6.2.2.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G core and the application server have all been configured as outlined in Section 6.2.2.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the end user device and ensure it registers with the 4G core for data services over NSA by connecting over the O-eNB as Master eNB and O-RU1 of the O-RAN system as the secondary gNB.
2. Once the registration is complete, the end user device has to establish a PDN connection with the 4G core.
3. Open the video streaming client on the end user device and start a streaming session over the HTTP/TCP protocol.
4. Once the video streaming session has started, move the device so it can hand over from O-RU1 to O-RU2 on the Secondary gNB while it continues to use the O-eNB as the Master eNB.
5. Allow the end user device to stream video through the entire handover process. Measure the KPIs included in Section 6.2 for this video streaming session.
6. Optionally, repeat steps 1 through 5 for a video streaming session which uses the HTTP/QUIC protocol for streaming.
7. Repeat the test multiple times (>10 times) and gather the results.

6.2.2.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user device without any errors, using the Master eNB and O-RU1, O-DU and O-CU-CP/O-CU-UP as the Secondary gNB. This is a prerequisite before these tests can be validated.

Validate the end user device is able to start streaming the video without delays and watch the video content through the entire handover process. Ensure there are no stalls, downgrades of video quality or intermittent buffering of the video content, especially during the handover process. Use the packet captures to validate there are no dropped or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI ranges will have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

• Video Start time or Time to load first video frame – ~1.5 seconds
• Number of video stalls/buffering – < 1
• Duration of stalls in the video – < 5 seconds
• Video MOS Score – > 3.5

The values for the end user video streaming KPIs defined in Section 6.2 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report considering the mean and standard deviation of all test results that have been captured.

Table 6-6 Example Test Report for Video Streaming – Handover between same Master eNB but different O-RUs – Intra-O-DU handover

                                      Video Streaming over HTTP/TCP    Video Streaming over HTTP/QUIC
Video Start Time
Number of Video Stalls/buffering
Duration of stalls in the video
Video MOS Score
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
The end user video streaming experience can also be impacted by some of the features available (see Section 6.2) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's video streaming experience, should be included in the test report to provide a comprehensive view of the testing.

6.2.3 Video Streaming – Handover between same MeNB but different O-DUs – Inter O-DU Intra O-CU

This test scenario validates the user's video streaming experience when the UE is connected over NSA to a 4G core and the UE is in the process of a handover between two O-RAN subcomponents (two O-RUs and O-DUs) on the Secondary gNB while remaining connected to the same Master eNB. As services like VoLTE or VoNR cannot be used to test handover of the secondary gNB, video streaming is used for this testing. Hence, this test scenario is only applicable in an NSA deployment.

6.2.3.1 Test Description

The 5G O-RAN system has multiple sub-components like the O-RU, O-DU and the O-CU-CP/O-CU-UP. This setup leads to multiple handover scenarios when the O-RAN system is connected as a Secondary gNB in an NSA deployment. This scenario tests the end user's video streaming experience when the end user device is connected over NSA to the 4G core and performs a handover between O-RUs which are connected to different O-DUs, which in turn are connected to the same O-CU-CP/O-CU-UP on the Secondary gNB – an inter-O-DU intra-O-CU handover. The end user device remains connected to the same Master eNB through the entire handover process. This handover is agnostic to the 4G core as the handover occurs on the O-RAN system. This test assesses the impact of this handover scenario on the end user device's video streaming service by monitoring the KPIs included in Section 6.2. This test case streams video using the ABR protocol over HTTP/TCP and/or HTTP/QUIC.


6.2.3.2 Test Setup

The SUT in this test case would be the Master eNB and a Secondary gNB. The Master eNB will be an O-eNB and the Secondary gNB will consist of a pair of O-RUs – O-RU1 and O-RU2 – which are connected to a pair of O-DUs – O-DU1 and O-DU2 – which are connected to the same O-CU-CP/O-CU-UP. The O-eNB, O-RUs, O-DUs and O-CU-CP/O-CU-UP have to comply with the O-RAN specifications. A 4G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. The 4G core could be a completely emulated, partially emulated or a real non-emulated core. The application server(s) for this testing should support video streaming using the ABR protocol over HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements or application server(s) are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-eNB needs to have the right configuration and software load. The pair of O-RUs (O-RU1 and O-RU2) need to be connected to the pair of O-DUs (O-DU1 and O-DU2), where O-RU1 is connected to O-DU1 and O-RU2 is connected to O-DU2. Both O-DUs, O-DU1 and O-DU2, need to be connected to the same O-CU-CP/O-CU-UP. All the O-RAN components (O-RUs, O-DUs and O-CU-CP/O-CU-UP) need to have the right configuration and software load. The end user device must be configured with the right user credentials to register and authenticate with the Master eNB and the 4G core. The end user device also needs to be provisioned with the right application, like a video streaming client, to perform the tests. The 4G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN connection establishment for this end user device.

All the elements in the network, like the O-RAN system, 4G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the video streaming KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNB, O-RUs, O-DUs and O-CU-CP/O-CU-UP), the O-RAN system needs to be connected to the 4G core, which in turn should have connectivity to the application server.

6.2.3.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G core and the application server have all been configured as outlined in Section 6.2.3.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the end user device and ensure it registers with the 4G core for data services over NSA by connecting over the O-eNB as Master eNB and O-RU1 of the O-RAN system as the secondary gNB.
2. Once the registration is complete, the end user device has to establish a PDN connection with the 4G core.
3. Open the video streaming client on the end user device and start a streaming session over the HTTP/TCP protocol.
4. Once the video streaming session has started, move the device so it can hand over from O-RU1 to O-RU2 (and in turn from O-DU1 to O-DU2) on the Secondary gNB while it continues to use the O-eNB as the Master eNB.
5. Allow the end user device to stream video through the entire handover process. Measure the KPIs included in Section 6.2 for this video streaming session.
6. Optionally, repeat steps 1 through 5 for a video streaming session which uses the HTTP/QUIC protocol for streaming.
7. Repeat the test multiple times (>10 times) and gather the results.


6.2.3.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user device without any errors, using the O-eNB as Master eNB and O-RU1, O-DU1 and O-CU-CP/O-CU-UP as the Secondary gNB. This is a prerequisite before these tests can be validated.

Validate the end user device is able to start streaming the video without delays and watch the video content through the entire handover process. Ensure there are no stalls, downgrades of video quality or intermittent buffering of the video content, especially during the handover process. Use the packet captures to validate there are no dropped or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI ranges will have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

• Video Start time or Time to load first video frame – ~1.5 seconds
• Number of video stalls/buffering – < 1
• Duration of stalls in the video – < 5 seconds
• Video MOS Score – > 3.5

The values for the end user video streaming KPIs defined in Section 6.2 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report considering the mean and standard deviation of all test results that have been captured.

Table 6-7 Example Test Report for Video Streaming – Handover between same Master eNB but different O-RUs and O-DUs – Inter-O-DU Intra-O-CU handover

                                      Video Streaming over HTTP/TCP    Video Streaming over HTTP/QUIC
Video Start Time
Number of Video Stalls/buffering
Duration of stalls in the video
Video MOS Score
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
The end user video streaming experience can also be impacted by some of the features available (see Section 6.2) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's video streaming experience, should be included in the test report to provide a comprehensive view of the testing.

6.2.4 Video Streaming – Handover between same MeNB but different O-CUs – Inter O-CU

This test scenario validates the user's video streaming experience when the end user device (UE) is connected over NSA to a 4G core and is in the process of a handover between two Secondary gNBs (O-RUs, O-DUs and O-CU-CP/O-CU-UPs) while remaining connected to the same Master eNB. As services like VoLTE or VoNR cannot be used to test handover of the secondary gNB, video streaming is used for this testing. Hence, this test scenario is only applicable in an NSA deployment.

6.2.4.1 Test Description

This scenario tests the impact of a handover on the video streaming service. The end user device is connected over NSA to the 4G core and streaming video while it performs a handover from the Secondary gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) to a new Secondary gNB (O-RU, O-DU and O-CU-CP/O-CU-UP), i.e. an inter-O-CU handover, while still being connected to the same Master eNB. This test assesses the impact of this handover scenario on the end user device's video streaming service by monitoring the end user video streaming KPIs included in Section 6.2. This test case streams video using the ABR protocol over HTTP/TCP and/or HTTP/QUIC.

6.2.4.2 Test Setup

The SUT in this test case would be the O-eNB, which would be the Master eNB, and a pair of Secondary gNBs – O-RU1, O-DU1, O-CU-CP/O-CU-UP1 and O-RU2, O-DU2, O-CU-CP/O-CU-UP2. The O-eNB, O-RUs, O-DUs and O-CU-CP/O-CU-UPs have to comply with the O-RAN specifications. A 4G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. The 4G core could be a completely emulated, partially emulated or a real non-emulated core. The application server(s) for this testing should support video streaming using the ABR protocol over HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-eNB and the pair of gNBs (O-RUs, O-DUs and O-CU-CP/O-CU-UPs) need to have the right configuration and software load. The pair of gNBs will be connected as follows – O-RU1 will be connected to O-DU1, which in turn will be connected to O-CU-CP/O-CU-UP1, and similarly O-RU2 will be connected to O-DU2, which in turn will be connected to O-CU-CP/O-CU-UP2. The end user device must be configured with the right user credentials to be able to register and authenticate with the Master eNB and the 4G core. The end user device also needs to be provisioned with the right application, like a video streaming client, to perform the tests. The 4G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN connection establishment for this end user device.

All the elements in the network, like the O-RAN system (O-eNB and gNBs), 4G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the video streaming KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNB and gNBs), the O-RAN system needs to be connected to the 4G core, which in turn should have connectivity to the application server.

6.2.4.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G core and the application server have all been configured as outlined in Section 6.2.4.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the end user device and ensure it registers with the 4G core for data services over NSA by connecting over the O-eNB as Master eNB and O-RU1, O-DU1, O-CU-CP/O-CU-UP1 as the secondary gNB.
2. Once the registration is complete, the end user device has to establish a PDN connection with the 4G core.
3. Open the video streaming client on the end user device and start a streaming session over the HTTP/TCP protocol.
4. Once the video streaming session has started, move the device so the secondary gNB is handed over from O-RU1 to O-RU2 (i.e. from O-RU1, O-DU1 and O-CU-CP/O-CU-UP1 to O-RU2, O-DU2 and O-CU-CP/O-CU-UP2), while it continues to use the O-eNB as the Master eNB.
5. Allow the end user device to stream video through the entire handover process. Measure the KPIs included in Section 6.2 for this video streaming session.
6. Optionally, repeat steps 1 through 5 for a video streaming session which uses the HTTP/QUIC protocol for streaming.
7. Repeat the test multiple times (>10 times) and gather the results.

6.2.4.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user device without any errors, using the Master eNB and O-RU1, O-DU1 and O-CU-CP/O-CU-UP1 as the secondary gNB. This is a prerequisite before these tests can be validated.

Validate the end user device is able to start streaming the video without delays and watch the video content through the entire handover process. Ensure there are no stalls, downgrades of video quality or intermittent buffering of the video content, especially during the handover process. Use the packet captures to validate there are no dropped or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI ranges will have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

• Video Start time or Time to load first video frame – ~1.5 seconds
• Number of video stalls/buffering – < 1
• Duration of stalls in the video – < 5 seconds
• Video MOS Score – > 3.5

The values for the end user video streaming KPIs defined in Section 6.2 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report considering the mean and standard deviation of all test results that have been captured.

Table 6-8 Example Test Report for Video Streaming – Handover between same Master eNB but different O-RUs – Inter-O-CU handover

                                      Video Streaming over HTTP/TCP    Video Streaming over HTTP/QUIC
Video Start Time
Number of Video Stalls/buffering
Duration of stalls in the video
Video MOS Score
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]


The end user video streaming experience can also be impacted by some of the features available (see Section 6.2) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's video streaming experience, should be included in the test report to provide a comprehensive view of the testing.

6.2.5 Video Streaming – Handover between different MeNB while staying connected to same SgNB

This test scenario validates the user's video streaming experience when the end user device (UE) is connected over NSA to a 4G core and the UE is in the process of a handover between two Master eNBs (two O-eNBs) while staying connected to the same Secondary gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). As services like VoLTE or VoNR cannot be used to test handover of the secondary gNB, video streaming is used for this testing. Hence, this test scenario is only applicable in an NSA deployment.

6.2.5.1 Test Description

This scenario tests the impact of a handover on the video streaming service. The end user device is connected over NSA to the 4G core and streaming video while it performs a handover from the Master eNB to a new Master eNB, i.e. from O-eNB1 to O-eNB2, while staying connected to the same Secondary gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). This test assesses the impact of this handover scenario on the end user device's video streaming service by monitoring the end user video streaming KPIs included in Section 6.2. This test case streams video using the ABR protocol over HTTP/TCP and/or HTTP/QUIC.

20 6.2.5.2 Test Setup

The SUT in this test case would be a pair of O-eNBs (O-eNB1 and O-eNB2) which act as Master eNBs and a Secondary gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). The pair of O-eNBs, O-RU, O-DU and O-CU-CP/O-CU-UP have to comply with the O-RAN specifications. A 4G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. The 4G core could be a completely emulated, partially emulated or a real non-emulated core. The application server(s) for this testing should support video streaming using an ABR protocol over HTTP/TCP and/or HTTP/QUIC as applicable. The end user device (UE) used for testing could be a real UE or an emulated one. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

32 The pair of O-eNBs (O-eNB1 and O-eNB2) and the gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the
33 right configuration and software load. Tools which emulate latency need to be used and configured on the O-RAN
34 system (between O-RU, O-DU and O-CU-CP/O-CU-UP) to emulate real-world deployment conditions. The end user
35 device must be configured with the right user credentials to be able to register and authenticate with the O-RAN (O-
36 eNBs and gNB) and the 4G core. The end user device also needs to be provisioned with the right application like a
video streaming client to perform the tests. The 4G core network must be configured to support the end user device used for testing. This includes supporting registration, authentication and PDN session establishment for this end user device.

All the elements in the network, like the O-RAN system, 4G core and the application server, need to have the ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the capability to capture traces/packet captures to calculate the video streaming KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNBs and gNB), the O-RAN system needs to be connected to the 4G core, which in turn should have connectivity to the application server.

6 6.2.5.3 Test Methodology/Procedure

Ensure the end user device, O-RAN system, 4G core and the application server have all been configured as outlined in Section 6.2.5.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated. A hypothetical orchestration sketch follows the procedure steps below.

10 1. Power on the end user device and ensure it registers with the 4G core for data services over NSA by connecting
11 over the O-eNB1 as Master eNB and O-RU, O-DU and O-CU-CP/O-CU-UP as Secondary gNB.
12 2. Once the registration is complete, the end user device has to establish a PDN connection with the 4G core.
13 3. Open the video streaming client on the end user device and start a streaming session over HTTP/TCP protocol.
14 4. Once the video streaming session has started, move the end user device so it performs a handover of the Master
15 eNB from O-eNB1 to O-eNB2, while staying connected to the same Secondary gNB i.e. O-RU, O-DU and O-CU-
16 CP/O-CU-UP.
17 5. Allow the end user device to stream video through the entire handover process. Measure the KPIs included in
18 Section 6.2 for this video streaming session.
6. Optionally, repeat steps 1 through 5 for a video streaming session which uses the HTTP/QUIC protocol for streaming.
20 7. Repeat the test multiple times (> 10 times) and gather results.
21
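Purely for illustration, the loop below sketches how the repeated runs of this procedure could be automated; the testbed/UE controller API (attach_ue, start_stream, trigger_handover, collect_kpis) is an assumed abstraction over unspecified test tooling, not a defined interface.

```python
# Hypothetical automation of steps 1-7 above; the controller API is assumed.
def run_campaign(testbed, protocol: str = "HTTP/TCP", repetitions: int = 10):
    results = []
    for _ in range(repetitions):
        ue = testbed.attach_ue(master="O-eNB1")        # steps 1-2: register + PDN
        session = ue.start_stream(protocol)            # step 3: start streaming
        testbed.trigger_handover(ue, target="O-eNB2")  # step 4: MeNB handover
        results.append(session.collect_kpis())         # step 5: Section 6.2 KPIs
        ue.detach()
    return results  # run again with protocol="HTTP/QUIC" for step 6
```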

22 6.2.5.4 Test Expectation (expected results)

23 As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user device
24 without any errors using O-eNB1 as the Master eNB and the O-RU, O-DU and O-CU-CP/O-CU-UP as secondary gNB.
25 This is a prerequisite before these tests can be validated.

Validate the end user device is able to start streaming the video without delays and watch the video content through the entire handover process. Ensure there are no stalls, downgrades of video quality or intermittent buffering of the video content, especially during the handover process. Use the packet captures to validate there are no packet drops or out-of-sequence packets which could impact customer experience and point to a misconfigured or flawed system.

The average range of values for the KPIs is included below for guidance; a triage sketch follows the list. These values are applicable when the testing is performed in a controlled environment in good radio conditions, without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As multiple variables can affect the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range encountered during testing has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.

38 • Video Start time or Time to load first video frame – ~1.5 seconds
39 • Number of video stalls/buffering – < 1
40 • Duration of stalls in the video – < 5 seconds
41 • Video MOS Score – > 3.5
42
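The pass/investigate logic described above can be made explicit; the sketch below flags out-of-guidance KPIs for root-cause analysis rather than declaring a failed test. Thresholds mirror the bullets above; the key names are illustrative.

```python
# Sketch: guidance values are not strict pass/fail limits, so out-of-range
# KPIs are flagged for root-cause analysis (SUT vs. external variables).
GUIDANCE = {
    "video_start_time_s": lambda v: v <= 1.5,
    "num_stalls":         lambda v: v < 1,
    "stall_duration_s":   lambda v: v < 5,
    "video_mos":          lambda v: v > 3.5,
}

def triage(kpis: dict) -> list[str]:
    return [f"{k}={v}: outside guidance, investigate root cause"
            for k, v in kpis.items()
            if k in GUIDANCE and not GUIDANCE[k](v)]
```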


The values for the end user video streaming KPIs defined in Section 6.2 need to be included in the test report, along with the minimum configuration parameters listed in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

4 End user device side (real or emulated UE):

5 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

6 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

7 • Downlink transmission mode

8 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
9 (average sample per second)

10 The table below gives an example of the test report considering the mean and standard deviation of all test results that
11 have been captured.

Table 6-9 Example Test Report for Video Streaming – Handover between Master eNB while staying connected to the same Secondary gNB

| KPI / Radio parameter | Video Streaming over HTTP/TCP | Video Streaming over HTTP/QUIC |
|---|---|---|
| Video Start Time | | |
| Number of Video Stalls/buffering | | |
| Duration of stalls in the video | | |
| Video MOS Score | | |
| L1 DL Spectral efficiency [bps/Hz] | | |
| UE RSRP [dBm] | | |
| UE PDSCH SINR [dB] | | |
| MIMO rank | | |
| PDSCH MCS | | |
| DL RB number | | |
| UE CQI | | |
| UE RSRQ | | |
| UE PMI | | |
| UE RSSI | | |
| UE Buffer status | | |
| UE Packet delay | | |
| PDSCH BLER [%] | | |
14 The end user video streaming experience can also be impacted by some of the features available (see Section 6.2) on the
15 O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other
16 features/functionality which could impact the end user’s video streaming experience should be included in the test
17 report to provide a comprehensive view of the testing.

18 6.3 Voice Services – Voice over LTE (VoLTE)


Voice service forms another basic service provided to a customer on a telecom network. Even though the earlier 2G/3G networks provide voice service through a circuit-switched network, this document focuses on voice service provided using a packet-switched network. This section of the document includes test cases to validate the Voice over LTE service in different scenarios. The KPIs which will be monitored to assess the voice service are included below; a computation sketch follows the list:

• CSSR – Call Setup Success Rate % – Total number of successful calls divided by the total number of calls made, as a percentage.
• CST – Call Setup Time – Time taken from the initial SIP INVITE to when the SIP 180 Ringing is received, in seconds.
• MOS Score – Mean Opinion Score for the voice call. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
• Mute Rate % – The percentage of calls which were muted in both directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• One Way Calls % – The percentage of calls which were muted in one direction only (calls with RTP loss of > 3-4 s in one direction only are considered one-way calls). This needs to be monitored on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• RTP Packet Loss % – Number of RTP packets which were dropped/lost in the uplink/downlink direction as a percentage of total packets. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
18
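As an illustration of these definitions, the sketch below computes the call-level KPIs from one record per call; the field names, and the use of the lower 3 s bound for the RTP loss gap, are assumptions (MOS and RTP packet loss % come from media analysis and are omitted).

```python
# Illustrative computation of the voice KPIs defined above, assuming one
# record per call derived from SIP/RTP traces. Field names are assumed.
from dataclasses import dataclass

GAP_THRESHOLD_S = 3.0  # RTP loss gap treated as muting (spec says 3-4 s)

@dataclass
class CallRecord:
    setup_ok: bool              # call was set up successfully
    invite_to_ringing_s: float  # SIP INVITE -> SIP 180 Ringing (CST)
    max_rtp_gap_ul_s: float     # longest RTP loss gap, uplink
    max_rtp_gap_dl_s: float     # longest RTP loss gap, downlink

def voice_kpis(calls: list[CallRecord]) -> dict:
    ok = [c for c in calls if c.setup_ok]
    muted = [c for c in ok if c.max_rtp_gap_ul_s > GAP_THRESHOLD_S
             and c.max_rtp_gap_dl_s > GAP_THRESHOLD_S]      # both directions
    one_way = [c for c in ok if (c.max_rtp_gap_ul_s > GAP_THRESHOLD_S)
               != (c.max_rtp_gap_dl_s > GAP_THRESHOLD_S)]   # one direction only
    return {
        "CSSR_pct": 100.0 * len(ok) / len(calls),                   # > 99%
        "CST_s": sum(c.invite_to_ringing_s for c in ok) / len(ok),  # < 2.5 s
        "mute_rate_pct": 100.0 * len(muted) / len(ok),              # < 1%
        "one_way_pct": 100.0 * len(one_way) / len(ok),              # < 1%
    }
```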

Along with the monitoring and validation of these services using user experience KPIs, the O-RAN system also needs to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN system. Some of these features are listed below. As part of the voice services testing, details of these features need to be included in the test report to get a comprehensive view of the setup used for testing. If additional features or functionalities that impact the end user voice experience have been enabled during this testing, those need to be included in the test report as well.

25 • RLC in Unacknowledged Mode


26 • Robust Header Compression
27 • DRX
28 • Dynamic GBR Admission Control
29 • TTI Bundling
30 • VoLTE Inactivity Timer
31 • Frequency Hopping
32 • Multi-Target RRC Connection Re-Establishment
33 • VoLTE HARQ
34 • Coordinated Multi Point (DL and UL)
35 • VoLTE Quality Enhancement
36 • Packet Loss Detection
37 • Voice Codec Aware scheduler
38 • NR to LTE PS Redirection/Cell Reselection/Handover
39 • LTE to NR PS Redirection/Cell Reselection/Handover
The table below summarizes the test cases and the technologies to which they apply.

Table 6-10: VoLTE Test Case Selection summary

| Test ID | Test case | Functional group | LTE | NSA | SA |
|---|---|---|---|---|---|
| 6.3.1 | VoLTE Stationary Test | Service | Y | N/A | N/A |
| 6.3.2 | VoLTE handover Test (intra) | Service | Y | N/A | N/A |
| 6.3.3 | Voice Service – LTE to NR / NR to LTE handover test | Service | Y | N/A | Y |

1 6.3.1 VoLTE Stationary Test

2 This scenario tests the voice service experience on an LTE network – Voice over LTE (VoLTE) when the end user
3 device is stationary.

4 6.3.1.1 Test Description

With the widespread adoption of LTE, VoLTE has become the primary method of providing voice service. VoLTE uses IP packets to send and receive voice, with the IP packets being transferred over LTE. The IP Multimedia Subsystem (IMS) forms an important part of VoLTE as it is used to set up the control and data planes needed for VoLTE communication. As voice service is latency sensitive, the 4G core interacts with the IMS core to set up different bearers for voice traffic – QCI-5 for the VoLTE control plane and QCI-1 for the VoLTE data plane. This test case is applicable when the UE is connected over an O-RAN system to a 4G core in an NSA deployment.

11 6.3.1.2 Test Setup

The SUT in this test case would be an O-eNB, which would be the Master eNB, and a Secondary gNB. As most current NSA deployments use the 4G eNB to provide voice services, the use of a Secondary gNB is optional. We have however included the Secondary gNB in this test scenario, but it is only applicable if the gNB plays a role in establishing the control plane or data plane for a voice call. The O-eNB, gNB and the components within these have to comply with the O-RAN specifications. The O-RAN setup should support the ability to perform this testing in the different radio conditions defined in Section 3.6. A 4G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. An IMS core will be required to register the end user device to support voice services on a 4G network. The 4G and IMS cores could be completely emulated, partially emulated or real non-emulated cores. We will need at least two end user devices (UEs), which can be real UEs or emulated ones, and both have to support voice service using VoLTE. The end user devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user devices forming the two ends of the voice call. Going forward in this section, these end user devices will be referred to as the MO end user device and the MT end user device to represent the role they play in the voice call. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

28 The Master eNB, Secondary gNB and their components (O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP) need to have
29 the right configuration and software load. The end user device must be configured with the right user credentials to be
30 able to register and authenticate with the O-RAN system and the 4G core. The end user devices also need to be
31 provisioned with the user credentials to attach to the 4G core and register with the IMS core to perform voice call using
32 VoLTE. The 4G core network and IMS core must be configured to support end user devices used for testing. This
33 includes supporting registration, authentication and PDN connection establishment for these end user devices. This also
34 includes provisioning the IMS core to support registration of the end user devices to make voice calls over VoLTE, and
35 dynamically setting up dedicated bearers for voice calls. The locations where the radio conditions are excellent, good,
36 fair and poor need to be identified within the serving cell.


1 All the elements in the network like O-RAN system, 4G core and the IMS Core need to have the ability to capture
2 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
3 traces/packet capture to calculate the VoLTE KPIs. Optionally, the network could have network taps deployed in
4 various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNBs and gNBs), the O-RAN system needs to be connected to the 4G core, which in turn should have connectivity to the IMS core.

8 6.3.1.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 4G core and the IMS core have all been configured as outlined in Section 6.3.1.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use the same O-RAN system (i.e. the same O-eNB), 4G core and IMS core to perform the end-to-end voice call. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated. A sketch driving the radio-condition matrix follows the procedure steps below.

14 1. Power on the two end user devices in excellent radio condition and ensure both of the end user devices connect to
15 the 4G core over the Master O-eNB and optionally secondary gNB.
16 2. Ensure both the MO and MT end user devices can establish a PDN connection with the 4G core. Once the PDN
17 connection has been setup, both the end user devices have to register with the IMS core to support voice services.
18 3. Use the MO end user device to call the MT end user device. Validate the MT end user device is able to receive and
19 answer the call.
20 4. Continue to have two-way voice communication on the voice call for at least 5 minutes before terminating it.
21 5. Repeat the test multiple times (>10 times) and gather results.
22 6. Repeat the above steps 1 through 5 for the MO and MT end user devices in good, fair and poor radio conditions.
23
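A hypothetical driver for the radio-condition matrix above, reusing the voice_kpis() sketch from earlier in Section 6.3; the testbed API (place_ues, make_call) is an assumed abstraction, and "placing" UEs may equally be done with attenuators.

```python
# Hypothetical driver for steps 1-6: repeat the call campaign at the
# pre-identified excellent/good/fair/poor locations (Section 3.6).
CONDITIONS = ["excellent", "good", "fair", "poor"]

def run_matrix(testbed, calls_per_condition: int = 10):
    results = {}
    for cond in CONDITIONS:
        testbed.place_ues(cond)  # move MO/MT UEs (or adjust attenuation)
        records = [testbed.make_call(duration_s=300)     # steps 3-4: 5 min call
                   for _ in range(calls_per_condition)]  # step 5: > 10 runs
        results[cond] = records  # feed into voice_kpis() for the report table
    return results
```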

24 6.3.1.4 Test Expectation (expected results)

25 As a pre-validation, use the traces to validate a successful PDN connection setup by the end user devices without any
26 errors using the Master eNB and optionally the secondary gNB. Also validate both the end user devices are able to
27 register with the IMS core for voice services. This is a prerequisite before these tests can be validated.

Validate the end user devices are able to make a voice call to each other by dynamically setting up a QCI-1 bearer to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible, without one-way or intermittent muting. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter which could cause voice muting issues. Use the packet captures to ensure there are no out-of-sequence packets which could impact the customer's voice experience.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions, without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote network nodes could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As multiple variables can affect the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range encountered during testing has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.

• CSSR – Call Setup Success Rate % – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%


• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
3

The values for the end user voice KPIs defined in Section 6.3 need to be included in the test report, along with the minimum configuration parameters listed in Section 3.3. The following information should also be included in the test report for the testing performed in different radio conditions, to provide a comprehensive view of the test setup.

7 End user device side (real or emulated UE):

8 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

9 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

10 • Downlink transmission mode

11 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
12 (average sample per second)

13 The table below gives an example of the test report considering the mean and standard deviation of all test results that
14 have been captured.

Table 6-11 Example Test Report for Voice over LTE Testing – Stationary Test

| KPI / Radio parameter | Excellent (cell centre) VoLTE MO/MT | Good VoLTE MO/MT | Fair VoLTE MO/MT | Poor (cell edge) VoLTE MO/MT |
|---|---|---|---|---|
| Call Setup Success Rate | | | | |
| Call Setup time | | | | |
| MOS Score | | | | |
| Mute Rate | | | | |
| One Way Call | | | | |
| RTP Packet Loss | | | | |
| L1 DL Spectral efficiency [bps/Hz] | | | | |
| UE RSRP [dBm] | | | | |
| UE PDSCH SINR [dB] | | | | |
| MIMO rank | | | | |
| PDSCH MCS | | | | |
| DL RB number | | | | |
| UE CQI | | | | |
| UE RSRQ | | | | |
| UE PMI | | | | |
| UE RSSI | | | | |
| UE Buffer status | | | | |
| UE Packet delay | | | | |
| PDSCH BLER [%] | | | | |
The end user VoLTE experience can also be impacted by some of the features available (see Section 6.3) on the O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which could impact the end user's voice service experience, should be included in the test report to provide a comprehensive view of the testing.

3 6.3.2 VoLTE Handover Test

4 This test section is FFS

5 6.3.3 Voice Service - LTE and NR handover tests

This test scenario validates the user's voice experience when the UE is in a voice call and performs a handover from an LTE network to a 5G network and vice versa. This test scenario is applicable for a 5G SA deployment.

8 6.3.3.1 Test Description

Voice service is one of the basic services provided on the telecommunication network. Voice service on the 4G network is provided using VoLTE. Similarly, voice service on the 5G network is provided using a packet-switched technology called Voice over New Radio (VoNR). While a 5G network is being deployed, a telecommunication service provider will need to support both the 4G and 5G networks, and thus support VoLTE, VoNR and the handover between the two voice services. This scenario tests the end user's voice experience when the end user device performs a handover from VoLTE to VoNR and vice versa.

15 6.3.3.2 Test Setup

The SUT in this test case would be an O-eNB along with a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). We would also need an interworking 4G-5G core (referred to as the 4G-5G core going forward) which supports a combined anchor point for 4G and 5G, i.e. SMF+PGW-C and UPF+PGW-U. The O-eNB connects to the 4G-5G core over the 4G interfaces like S1 to provide 4G LTE service, and the O-CU-CP/O-CU-UP, O-DU and O-RU connect to the 4G-5G core over the 5G interfaces like N2 and N3 to provide 5G SA service. The O-eNB, gNB and the components within these have to comply with the O-RAN specifications. The 4G-5G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection/PDU session. An IMS core will be required to register the end user device to support voice services on a 4G and 5G network. The 4G and 5G cores have to interwork using the N26 interface between the MME and AMF to support seamless handover. We recommend the use of the N26 interface to ensure a better customer experience when the end user device performs a handover between 4G and 5G. The 4G-5G and IMS cores could be completely emulated, partially emulated or real non-emulated cores. We will need at least two end user devices (UEs), which can be real UEs or emulated ones, and both have to support voice service using VoLTE and VoNR as well as the capability to hand over from VoLTE to VoNR and vice versa. The end user devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user devices forming the two ends of the voice call. Going forward in this section, these end user devices will be referred to as the MO end user device and the MT end user device to represent their role in the voice call. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

36 The O-eNB, gNB and their components (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the right configuration
37 and software load. The end user devices must be configured with the right user credentials to be able to register and
38 authenticate with the O-RAN system and the 4G-5G core. The end user devices also need to be provisioned with the
39 user credentials to support attach procedure to the 4G-5G core, registering to the IMS core and performing a voice call
40 using VoLTE and VoNR. This includes supporting registration, authentication and PDN connection/PDU Session
41 establishment for these end user devices. This also includes provisioning the IMS core to support registration of the end


1 user devices to make voice calls over VoLTE and VoNR, and dynamically setting up dedicated bearers/QoS Flows for
2 voice calls.

3 All the elements in the network like O-RAN system, 4G-5G core and the IMS Core need to have the ability to capture
4 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
5 traces/packet capture to calculate the VoLTE and VoNR KPIs. Optionally, the network could have network taps
6 deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally,
all these different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNBs and gNBs), the O-RAN system needs to be connected to the 4G-5G core, which in turn should have connectivity to the IMS core.

10 6.3.3.3 Test Methodology/Procedure

11 Ensure the end user devices, O-RAN system, 4G-5G core and the IMS Core have all been configured as outlined in
12 Section 6.3.3.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use
13 the eNB and gNB to connect to the same 4G-5G and IMS core to perform the end-to-end voice call. All traces and
14 packet captures need to be enabled for the duration of the testing to ensure all communication between network
15 elements can be captured and validated.

The steps below cover a VoLTE to VoNR handover, followed by a VoNR to VoLTE handover.

17 1. Power on the two end user devices and ensure both of the end user devices can connect to the 4G-5G core over the
18 O-eNB.
19 2. Ensure both the MO and MT end user devices can establish a PDN connection with the 4G-5G core. Once the PDN
20 connection has been setup, both the end user devices have to register with the IMS core to support voice services.
21 3. Use the MO end user device to call the MT end user device. Validate the MT end user device is able to receive and
22 answer the call.
23 4. Once the two-way call has been setup and communication is going back and forth between the two end user
24 devices, move the MO device to the O-RU coverage of the gNB, thus forcing a handover from 4G to 5G, hence
25 forcing a VoLTE to VoNR handover. Continue two-way voice communication through the handover process and
26 terminate the call once the handover process is complete. Measure the voice KPIs included in Section 6.3.
27 5. At this point in time, the MO end user device is in 5G coverage and registered to the 4G-5G core. The MT end user
28 device is in 4G coverage and registered to the 4G-5G core. Both end user devices should still be registered to the
29 IMS core.
30 6. Use the MO end user device to call the MT end user device. Validate the MT end user device is able to receive and
31 answer the call.
32 7. Once the two-way call has been setup and communication is going back and forth between the two end user
33 devices, move the MO device to the O-eNB coverage, thus forcing a handover from 5G to 4G, hence forcing a
34 VoNR to VoLTE handover. Continue two-way voice communication through the handover process and terminate
35 the call once the handover process is complete. Measure the voice KPIs included in Section 6.3.
36 8. Repeat the test multiple times (> 10 times) and gather results.
37

38 6.3.3.4 Test Expectation (expected results)

39 As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user
40 devices without any errors using the O-eNB when in 4G coverage. Similarly, use traces to validate successful
41 registration and PDU session setup by the end user device without any errors using O-RU, O-DU and O-CU-CP/O-CU-
42 UP when in 5G coverage. Also validate both the end user devices are able to register with the IMS core for voice
43 services. This is a prerequisite before these tests can be validated.


Validate the end user devices are able to make a voice call to each other by dynamically setting up a QCI-1 bearer/5QI-1 QoS flow to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible, without one-way or intermittent muting. Ensure the voice quality is not impacted during the handover process – VoLTE to VoNR and VoNR to VoLTE. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter which could cause voice muting issues, especially during the handover process. Use the packet captures to ensure there are no out-of-sequence packets which could impact the customer's voice experience.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions, without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote network nodes could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As multiple variables can affect the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range encountered during testing has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.

• CSSR – Call Setup Success Rate % – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
21

The values for the end user voice KPIs defined in Section 6.3 need to be included in the test report, along with the minimum configuration parameters listed in Section 3.3. The following information should also be included in the test report for the testing performed in different radio conditions, to provide a comprehensive view of the test setup.

25 End user device side (real or emulated UE):

26 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

27 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

28 • Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)

29 • Downlink transmission mode

30 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
31 (average sample per second)

32 The table below gives an example of the test report considering the mean and standard deviation of all test results that
33 have been captured.

Table 6-12 Example Test Report for Voice service handover testing – LTE and NR

| KPI / Radio parameter | VoLTE to VoNR handover | VoNR to VoLTE handover |
|---|---|---|
| Call Setup Success Rate | | |
| Call Setup time | | |
| MOS Score | | |
| Mute Rate | | |
| One Way Call | | |
| RTP Packet Loss | | |
| L1 DL Spectral efficiency [bps/Hz] | | |
| UE RSRP [dBm] | | |
| UE PDSCH SINR [dB] | | |
| MIMO rank | | |
| PDSCH MCS | | |
| DL RB number | | |
| UE CQI | | |
| UE RSRQ | | |
| UE PMI | | |
| UE RSSI | | |
| UE Buffer status | | |
| UE Packet delay | | |
| PDSCH BLER [%] | | |
1 The end user VoLTE/VoNR experience can also be impacted by some of the features available (see Section 6.3) on the
2 O-eNB, O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other
3 features/functionality which could impact the end user’s voice service experience should be included in the test report to
4 provide a comprehensive view of the testing.

5 6.4 Voice Service – EPS Fallback


Voice service is one of the basic services which must be supported on every telecom network. Upgrades of a telecom network occur in phases, and this is no different for 5G. The telecom network may not be able to support voice service on 5G during this phase for multiple reasons – the 5G network may not be deployed nationwide, the 5G network may not be tuned to support voice service, or the devices may not be able to support voice service on 5G. However, there can be no interruption to voice service during this transition phase. EPS fallback is the method used to support voice services during this phase, where voice services continue to be supported on the legacy LTE network using VoLTE by forcing the device to fall back to LTE to make or receive a call. The KPIs which will be monitored to assess the voice service are included below:

• CSSR – Call Setup Success Rate % – Total number of successful calls divided by the total number of calls made, as a percentage.
• CST – Call Setup Time – Time taken from the initial SIP INVITE to when the SIP 180 Ringing is received, in seconds.
• MOS Score – Mean Opinion Score for the voice call. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
• Mute Rate % – The percentage of calls which were muted in both directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• One Way Calls % – The percentage of calls which were muted in one direction only (calls with RTP loss of > 3-4 s in one direction only are considered one-way calls). This needs to be monitored on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• RTP Packet Loss % – Number of RTP packets which were dropped/lost in the uplink/downlink direction as a percentage of total packets. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
29

Along with the monitoring and validation of these services using user experience KPIs, the O-RAN system also needs to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN system. Some of these features are listed below. As part of the voice services testing, details of these features need to be included in the test report to get a comprehensive view of the setup used for testing. If additional features or functionalities that impact the end user voice experience have been enabled during this testing, those need to be included in the test report as well.

5 • EPS Fallback for IMS Voice


6 • NR to EPS Mobility
7 • RLC in Unacknowledged Mode
8 • Robust Header Compression
9 • DRX
10 • Dynamic GBR Admission Control
11 • TTI Bundling
12 • VoLTE Inactivity Timer
13 • Frequency Hopping
14 • Multi-Target RRC Connection Re-Establishment
15 • VoLTE HARQ
16 • Coordinated Multi Point (DL and UL)
17 • VoLTE Quality Enhancement
18 • Packet Loss Detection
19 • Voice Codec Aware scheduler
20
The table below summarizes the test case and the technologies to which it applies.

Table 6-13: EPS Fallback Test Case Selection summary

| Test ID | Test case | Functional group | LTE | NSA | SA |
|---|---|---|---|---|---|
| 6.4.1 | EPS Fallback (N26 and without N26) | Service | Y | N/A | Y |

26 6.4.1 EPS Fallback Test

27 This scenario tests the voice service when an end user device (UE) is in 5G SA coverage and performs an EPS fallback
28 to 4G to make or receive a voice call.

29 6.4.1.1 Test Description

This section tests the voice service on a 5G SA network when the EPS fallback mechanism is used to fall back to the LTE network and use VoLTE to support the voice service. This testing only applies to a 5G SA deployment; in this scenario the UE is connected to the 5G SA core and registered with the IMS core for voice service. When the end user device (UE) wants to make or receive a voice call, the network instructs the device to fall back to the LTE network to make/receive the voice call. EPS fallback does increase the Call Setup Time, due to the time needed to fall back before making/receiving the voice call; a back-of-envelope sketch of this effect follows.
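A back-of-envelope sketch of this CST impact, under the assumption (consistent with the guidance in Table 6-14 below, but not normative) that fallback execution adds roughly 1.0 s with N26 and 1.5 s without it on top of the plain VoLTE setup time.

```python
# Back-of-envelope sketch only: EPS fallback adds the fallback execution time
# on top of the plain VoLTE call setup time. Delay figures are assumptions.
VOLTE_CST_S = 2.5                                         # Section 6.3 guidance
FALLBACK_DELAY_S = {"with_N26": 1.0, "without_N26": 1.5}  # assumed delays

def expected_cst(variant: str) -> float:
    return VOLTE_CST_S + FALLBACK_DELAY_S[variant]  # ~3.5 s / ~4 s (Table 6-14)
```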

There are two mechanisms by which EPS fallback is supported on the 5G core, and the mechanism used affects the time taken to perform the fallback to LTE, and hence the Call Setup Time. The two mechanisms that could be used to perform EPS fallback are described below; a trace-classification sketch follows the list. The two mechanisms change the test setup but do not change the testing procedure.

8 • With N26 interface – The AMF in the 5G core communicates with the MME in the 4G over the N26 interface.
9 In this scenario the UE performs a handover from the 5G network to the 4G network.
10 • Without N26 interface – The AMF in the 5G core does not communicate directly with the 4G core, instead
11 uses the UDM/HSS to store and transfer relevant session information to the 4G core. In this scenario the UE
12 performs a Release with Redirect from the 5G network to the 4G network.
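For illustration, the two variants can be told apart in the UE/core traces by the mobility procedure observed; the event names below are assumed labels for the corresponding RRC/NAS messages, not defined identifiers.

```python
# Hedged sketch: classifying which EPS fallback variant was executed from the
# signalling events seen in a trace. Event labels are illustrative only.
def classify_eps_fallback(events: list[str]) -> str:
    if "inter_rat_handover_command" in events:   # N26: 5G-to-4G handover
        return "with_N26_handover"
    if "rrc_release_with_redirect" in events:    # no N26: release with redirect
        return "without_N26_redirect"
    return "no_fallback_observed"
```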

13 6.4.1.2 Test Setup

The SUT in this test case would be an O-eNB along with a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). We would also need an interworking 4G-5G core (referred to as the 4G-5G core going forward) which supports a combined anchor point, i.e. SMF+PGW-C and UPF+PGW-U. The O-eNB connects to the 4G-5G core over the 4G interfaces like S1 to provide 4G LTE service, and the gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) connects to the 4G-5G core over the 5G interfaces like N2 and N3 to provide 5G SA service. The O-eNB, gNB and the components within these have to comply with the O-RAN specifications and support EPS fallback. The 4G-5G core will be required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection/PDU session. The IMS core will be required to register the end user device and needs to be integrated with the 4G-5G core to support voice services over VoLTE and EPS fallback. The 4G-5G core has to interwork either with or without the N26 interface, depending on the desired core network configuration. The existence of the N26 interface reduces the EPS fallback time, hence reducing the Call Setup Time and providing a better end user experience. The 4G-5G and IMS cores could be completely emulated, partially emulated or real non-emulated cores. We will need at least two end user devices (UEs), which can be real UEs or emulated ones, and both have to support voice service using VoLTE and the EPS fallback procedure. The end user devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user devices forming the two ends of the voice call. For the sake of clarity, we will address these end user devices as UE-1 and UE-2 in this section. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

33 The O-eNB, gNB and their components (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the right configuration
34 and software load. The end user device must be configured with the right user credentials to be able to register and
35 authenticate with the O-RAN system and the 4G-5G core. The end user devices also need to be provisioned with the
36 user credentials to attach to the 4G-5G core and register with the IMS core to perform voice call using VoLTE and EPS
37 fallback. The 4G-5G core network and IMS core must be configured to support voice service on the end user devices
38 used for testing, which includes dynamically setting up dedicated bearers for voice calls.

39 All the elements in the network like O-RAN system, 4G-5G core and the IMS Core need to have the ability to capture
40 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
41 traces/packet capture to calculate the VoLTE KPIs. Optionally, the network could have network taps deployed in
42 various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
different components need to have connectivity with each other – the end user device should be able to connect to the O-RAN system (O-eNBs and gNBs), the O-RAN system needs to be connected to the 4G-5G core, which in turn should have connectivity to the IMS core.


1 6.4.1.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 4G-5G core and the IMS core have all been configured as outlined in Section 6.4.1.2. In this test scenario, one of the end user devices is connected over the 4G O-eNB to the 4G-5G core, while the other end user device is connected over the 5G gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) to the 4G-5G core. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the two end user devices and ensure UE-1 is in LTE coverage while UE-2 is in 5G coverage. Validate both end user devices register with the 4G-5G core.
2. Once the registration is complete, UE-1 and UE-2 have to establish a PDN connection and a PDU session respectively with the 4G-5G core. Once the PDN connection and PDU session have been set up, both end user devices have to register with the IMS core to support voice services.
3. Use UE-1 as the MO end user device to call UE-2. Validate that UE-2 performs EPS fallback to LTE to receive the call.
4. Answer the call on UE-2, continue two-way communication for at least 5 minutes and then terminate the call. Measure the voice KPIs included in Section 6.4.
5. At this point UE-2 should have completed the call and moved back to connect to the 4G-5G core over the 5G gNB.
6. Use UE-2 as the MO end user device to call UE-1. Validate that UE-2 performs EPS fallback to LTE before making the call.
7. Answer the call on UE-1, continue two-way communication for at least 5 minutes and then terminate the call. Measure the voice KPIs included in Section 6.4.
8. Repeat the test multiple times (> 10 times) and gather results.
22

23 6.4.1.4 Test Expectation (expected results)

24 As a pre-validation, use the traces to validate end user device UE-1 is able to perform a successful registration and PDN
25 connection setup while using the O-eNB in LTE coverage. Similarly, use the traces to validate end user device UE-2 is
26 able to perform a successful registration and PDU session setup while using gNB in 5G coverage. Also validate both the
27 end user devices are able to register with the IMS core for voice services. This is a prerequisite before these tests can be
28 validated.

Validate the end user devices are able to make a voice call to each other by dynamically setting up a QCI-1 bearer to transfer voice packets over RTP. Validate that the device in 5G coverage falls back to 4G before receiving or making a voice call, in other words that the EPS fallback procedure has been executed successfully. The Call Setup Time will be higher than for plain VoLTE due to the delay associated with executing the EPS fallback procedure. Even though this test case does not directly target the quality of the voice call, for the sake of consistency, ensure the voice quality for both parties is clear and audible, without one-way or intermittent muting. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter which could cause voice muting issues. Use the packet captures to ensure there are no out-of-sequence packets which could impact the customer's voice experience.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions, without the interference of external factors which could impact the KPIs; for example, use of the internet to connect to remote network nodes could add latency, jitter and packet loss to the connection, thus impacting the KPIs. As multiple variables can affect the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range encountered during testing has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.


Table 6-14 Typical Voice KPI values in controlled environments with good radio conditions

| KPI | With N26 Interface | Without N26 Interface |
|---|---|---|
| CSSR – Call Setup Success Rate % | > 99% | > 99% |
| CST – Call Setup Time | 3.5 s | 4 s |
| MOS Score | 3.5 | 3.5 |
| Mute Rate % | < 1% | < 1% |
| One Way Call % | < 1% | < 1% |
| RTP Packet Loss % | < 1% | < 1% |


2

The values for the end user voice KPIs defined in Section 6.3 need to be included in the test report, along with the minimum configuration parameters listed in Section 3.3. The following information should also be included in the test report to provide a comprehensive view of the test setup.

6 End user device side (real or emulated UE):

7 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

8 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

9 • Downlink transmission mode

10 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
11 (average sample per second)

12 The table below gives an example of the test report considering the mean and standard deviation of all test results that
13 have been captured.

Table 6-15 Example Test Report for Voice Service Testing – EPS Fallback

| KPI / Radio parameter | VoLTE MO | VoLTE MT |
|---|---|---|
| Call Setup Success Rate | | |
| Call Setup time | | |
| MOS Score | | |
| Mute Rate | | |
| One Way Call | | |
| RTP Packet Loss | | |
| L1 DL Spectral efficiency [bps/Hz] | | |
| UE RSRP [dBm] | | |
| UE PDSCH SINR [dB] | | |
| MIMO rank | | |
| PDSCH MCS | | |
| DL RB number | | |
| UE CQI | | |
| UE RSRQ | | |
| UE PMI | | |
| UE RSSI | | |
| UE Buffer status | | |
| UE Packet delay | | |
| PDSCH BLER [%] | | |
1 The end user VoLTE experience can also be impacted by some of the features available (see Section 6.3) on the O-eNB,
2 O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which
3 could impact the end user’s voice service experience should be included in the test report to provide a comprehensive
4 view of the testing.

5 6.5 Voice Service – Voice over NR (VoNR)


Voice service on 5G is called Voice over New Radio (VoNR). VoNR also packetizes voice and uses IP packets for voice communication. On the core and IMS side, VoNR is very similar to VoLTE: VoNR also uses the IMS for voice service, and the IMS interacts with the 5G core to set up a separate bearer (QoS flow) for voice service. This section tests the voice service over VoNR in different scenarios. The KPIs which will be monitored to assess the voice service are included below:

• CSSR – Call Setup Success Rate % – Total number of successful calls divided by the total number of calls made, as a percentage.
• CST – Call Setup Time – Time taken from the initial SIP INVITE to when the SIP 180 Ringing is received, in seconds.
• MOS Score – Mean Opinion Score for the voice call. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
• Mute Rate % – The percentage of calls which were muted in both directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• One Way Calls % – The percentage of calls which were muted in one direction only (calls with RTP loss of > 3-4 s in one direction only are considered one-way calls). This needs to be monitored on both ends of the voice call – mobile originated and mobile terminated – and counted only once per call.
• RTP Packet Loss % – Number of RTP packets which were dropped/lost in the uplink/downlink direction as a percentage of total packets. This needs to be measured on both ends of the voice call – mobile originated and mobile terminated.
26

End user voice service experience can also be impacted by some of the features available on the O-RU, O-DU and O-CU-CP/O-CU-UP. Some of these features are listed below. As part of the voice services testing, details of these features need to be included in the test report to get a comprehensive view of the setup used for testing. If additional features or functionalities that impact the end user voice experience have been enabled during this testing, those need to be included in the test report as well.

32 • Basic Voice over NR


33 • RLC in Unacknowledged Mode
34 • Robust Header Compression
35 • DRX
36 • Dynamic GBR Admission Control
37 • TTI Bundling
• Automated Neighbor Relations (ANR)


1 • Connected mode mobility (Handover)


• Idle mode reselection
3 • Intra-frequency Cell Reselection/Handover
4 • Inter-frequency Redirection/Cell Reselection/Handover
5 • NR Coverage-Triggered NR Session Continuity
6
The table below summarizes the test cases and the technologies to which they apply.

Table 6-16: VoNR Test Case summary

| Test ID | Test case | Functional group | LTE | NSA | SA |
|---|---|---|---|---|---|
| 6.5.1 | Voice over NR Test – stationary | Service | N/A | N/A | Y |
| 6.5.2 | VoNR – Intra-Distributed Unit (DU) handover | Service | N/A | N/A | Y |
| 6.5.3 | VoNR – Intra-Central Unit (CU) Inter-Distributed Unit (DU) handover | Service | N/A | N/A | Y |
| 6.5.4 | VoNR – Inter-Central Unit (CU) handover | Service | N/A | N/A | Y |

10 6.5.1 Voice over NR Test

11 This test scenario validates the user’s voice experience when the end user device is in 5G coverage and the user makes a
12 voice call over NR – VoNR.

13 6.5.1.1 Test Description

This section tests the VoNR user experience on an O-RAN system. VoNR is similar to VoLTE in that it uses IP packets to send and receive voice, with the IP packets being transferred over NR (5G radio). Just like VoLTE, the IMS is used to set up the control and data planes for the voice communication. As voice service is latency sensitive, the 5G core interacts with the IMS core to set up different QoS flows for voice traffic – 5QI-5 for the VoNR control plane and 5QI-1 for the VoNR data plane. This test case is applicable when the end user device (UE) is connected over a 5G SA network. A sketch checking this QoS-flow mapping follows.
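A sketch of this mapping check, assuming the test tooling can correlate each captured SIP/RTP packet with the 5QI of the QoS flow it rode on (e.g. from an N3 capture); the (protocol, 5QI) tuple format is an assumption for illustration.

```python
# Illustrative check that VoNR traffic lands on the expected QoS flows:
# SIP signalling on 5QI-5, RTP media on 5QI-1. Input format is assumed.
EXPECTED_5QI = {"SIP": 5, "RTP": 1}

def check_qos_flow_mapping(packets: list[tuple[str, int]]) -> list[str]:
    errors = []
    for proto, five_qi in packets:
        want = EXPECTED_5QI.get(proto)
        if want is not None and five_qi != want:
            errors.append(f"{proto} seen on 5QI-{five_qi}, expected 5QI-{want}")
    return errors
```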

19 6.5.1.2 Test Setup

The SUT in this test case would be a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) which is used to test the voice service. A 5G SA core will be required to support the basic functionality to authenticate and register the end user device and to establish a PDU session. An IMS core will be required to register the end user device to support voice services on a 5G network. The 5G and IMS cores could be completely emulated, partially emulated or real non-emulated cores. We will need at least two end user devices (UEs), which can be real UEs or emulated ones, and both have to support voice service using VoNR. The end user devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user devices forming the two ends of the voice call. Going forward in this section, these end user devices will be referred to as the MO end user device and the MT end user device to represent their role in the voice call. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.


The O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. The SUT also needs to be set up so that this testing can be run in the different radio conditions outlined in Section 3.6. The end user devices must be configured with the right user credentials to register and authenticate with the O-RAN system and the 5G core, to set up PDU sessions with the 5G core, and to register with the IMS core to perform voice calls using VoNR. The 5G core network and IMS core must be configured to support voice service on the end user devices used for testing, which includes the ability to dynamically set up QoS flows for voice calls. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.

All the elements in the network (O-RAN system, 5G core and IMS core) need to be able to capture traces to validate the successful execution of the test cases. The end user devices need to be able to capture traces/packet captures to calculate the VoNR KPIs. Optionally, the network can have network taps deployed in various legs of the network to obtain packet captures that validate successful execution of the test cases. Finally, all these components need connectivity with each other: the end user devices must be able to connect to the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP), and the O-RAN system needs to be connected to the 5G core, which in turn needs connectivity to the IMS core.

6.5.1.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 5G core and IMS core have all been configured as outlined in Section 6.5.1.2. In this test scenario, both the mobile originated and mobile terminated end user devices use the same O-RAN system, 5G core and IMS core to perform the end-to-end voice call. All traces and packet captures need to be enabled for the duration of the testing so that all communication between network elements can be captured and validated.

1. Power on the two end user devices in excellent radio conditions and ensure both devices register with the 5G core for voice services over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP).
2. Once the registration is complete, the MO and MT end user devices have to establish PDU sessions with the 5G core. Once the PDU sessions have been set up, both end user devices have to register with the IMS core to support voice services.
3. Use the MO end user device to call the MT end user device. Validate that the MT end user device is able to receive and answer the call.
4. Continue two-way voice communication on the call for at least 5 minutes before terminating it.
5. Repeat the test multiple times (> 10 times) and gather results (a scripted harness for this repetition is sketched after this list).
6. Repeat steps 1 through 5 for the good, fair and poor radio conditions.
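The repetition in steps 5 and 6 lends itself to scripting. Below is a minimal sketch of such a harness, assuming a hypothetical lab-automation hook place_vonr_call() that registers both UEs, runs one MO-to-MT call and returns its KPIs; all names are illustrative and not part of this specification.

```python
import statistics

# Hypothetical lab-automation hook (illustrative, not part of this spec):
# place_vonr_call() is assumed to register both UEs, set up one MO-to-MT
# call, hold it for hold_time_s seconds, tear it down and return a KPI dict.
from lab_hooks import place_vonr_call

RADIO_CONDITIONS = ["excellent", "good", "fair", "poor"]
ITERATIONS = 12        # the procedure asks for more than 10 repetitions
HOLD_TIME_S = 300      # two-way voice for at least 5 minutes

def run_campaign():
    results = {}
    for condition in RADIO_CONDITIONS:
        calls = [place_vonr_call(condition, hold_time_s=HOLD_TIME_S)
                 for _ in range(ITERATIONS)]
        ok = [c for c in calls if c["success"]]
        results[condition] = {
            "cssr_pct": 100.0 * len(ok) / len(calls),
            "mean_cst_s": (statistics.mean(c["setup_time_s"] for c in ok)
                           if ok else None),
        }
    return results
```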

6.5.1.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate that the end user devices complete registration and PDU session setup without any errors over the O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate that both end user devices are able to register with the IMS core for voice services. This is a prerequisite before these tests can be validated.

Validate that the end user devices are able to make a voice call to each other, with a 5QI-1 QoS flow set up dynamically to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible, without one-way audio or intermittent muting. Use the packet captures to validate that there are no RTP packet drops and no high RTP packet jitter, which could cause voice muting issues, and that there are no out-of-sequence packets, which could impact the customer's voice experience.
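Much of this validation can be automated from the captures. The sketch below parses the fixed RTP header of each packet of one stream (one SSRC) and derives the packet loss, the number of reordered packets and the RFC 3550 interarrival jitter; how the (arrival time, RTP bytes) pairs are exported depends on the capture tool, and the 16 kHz clock rate is an illustrative voice-codec assumption.

```python
import struct

def rtp_stream_stats(packets, clock_rate=16000):
    """Loss, reordering and RFC 3550 jitter for one RTP stream (one SSRC).

    packets: iterable of (arrival_time_s, rtp_bytes) pairs in capture order.
    """
    expected = received = out_of_order = 0
    jitter = 0.0
    prev_seq = prev_arrival = prev_ts = None
    for arrival, data in packets:
        # RTP fixed header (RFC 3550): V/P/X/CC, M/PT, sequence, timestamp, SSRC.
        _, _, seq, ts, _ = struct.unpack("!BBHII", data[:12])
        received += 1
        if prev_seq is None:
            expected = 1
        else:
            delta = (seq - prev_seq) & 0xFFFF
            if delta == 0 or delta > 0x8000:
                out_of_order += 1   # duplicate or late (reordered) packet
            else:
                expected += delta
                # Interarrival jitter estimator, in RTP timestamp units.
                d = (arrival - prev_arrival) * clock_rate - (ts - prev_ts)
                jitter += (abs(d) - jitter) / 16.0
        prev_seq, prev_arrival, prev_ts = seq, arrival, ts
    loss_pct = 100.0 * max(expected - received, 0) / max(expected, 1)
    return {"rtp_loss_pct": loss_pct, "out_of_order": out_of_order,
            "jitter_ms": 1000.0 * jitter / clock_rate}
```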

The average ranges of values for the KPIs are included below for guidance. These values apply when the testing is performed in a controlled environment in good radio conditions, without interference from external factors that could impact the KPIs; for example, using the internet to connect to remote servers/hosts can add latency, jitter and reliability issues to the connection. As there are multiple variables that can impact the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated. (A sketch that checks measured results against these guidance ranges follows the list.)

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
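A minimal sketch that flags KPI results falling outside these guidance ranges so they can be routed to root-cause analysis; the dictionary keys are illustrative names for the measured values.

```python
# Guidance ranges from the list above.
KPI_LIMITS = {
    "cssr_pct":      lambda v: v > 99.0,
    "cst_s":         lambda v: v < 2.5,
    "mos":           lambda v: v > 3.5,
    "mute_rate_pct": lambda v: v < 1.0,
    "one_way_pct":   lambda v: v < 1.0,
    "rtp_loss_pct":  lambda v: v < 1.0,
}

def flag_outliers(kpis):
    """Return the KPIs that fall outside the guidance ranges."""
    return [name for name, in_range in KPI_LIMITS.items()
            if name in kpis and not in_range(kpis[name])]

# A result outside the range triggers investigation, not automatic failure:
print(flag_outliers({"cssr_pct": 100.0, "cst_s": 3.1, "mos": 4.1}))  # ['cst_s']
```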

As part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test report. The following information should also be included in the test report to provide a comprehensive view of the test setup (a sketch reducing raw UE samples to the per-second averages follows the list).

UE side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)
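The list above asks for per-second averages of the raw UE samples. A minimal sketch of that reduction using pandas; the sample format and column names are assumptions about the UE trace export.

```python
import pandas as pd

def per_second_averages(samples):
    """Reduce raw UE trace samples to the 1 s averages used in the report.

    samples: iterable of dicts with a float "t" (seconds since test start)
    plus metric columns such as rsrp_dbm or sinr_db (names are assumed).
    """
    df = pd.DataFrame(samples)
    df.index = pd.to_datetime(df.pop("t"), unit="s")
    return df.resample("1s").mean()

print(per_second_averages([
    {"t": 0.2, "rsrp_dbm": -84.9},
    {"t": 0.7, "rsrp_dbm": -85.3},
    {"t": 1.1, "rsrp_dbm": -86.0},
]))
```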

Table 6-17: Example Test Report for VoNR – Stationary Testing

KPI / Measurement | Excellent (cell centre), VoNR MO/MT | Good, VoNR MO/MT | Fair, VoNR MO/MT | Poor (cell edge), VoNR MO/MT
Call Setup Success Rate | | | |
Call Setup Time | | | |
MOS Score | | | |
Mute Rate % | | | |
One Way Call % | | | |
RTP Packet Loss % | | | |
L1 DL Spectral efficiency [bps/Hz] | | | |
UE RSRP [dBm] | | | |
UE PDSCH SINR [dB] | | | |
MIMO rank | | | |
PDSCH MCS | | | |
DL RB number | | | |
UE CQI | | | |
UE RSRQ | | | |
UE PMI | | | |
UE RSSI | | | |
UE Buffer status | | | |
UE Packet delay | | | |
PDSCH BLER [%] | | | |
The end user voice service experience can also be impacted by some of the features available on the O-RU, O-DU and O-CU-CP/O-CU-UP (see Section 6.5). The details of these features, along with any other features or functionality that could impact the end user's voice service experience, should be included in the test report to provide a comprehensive view of the testing.

6.5.2 VoNR – Intra-Distributed Unit (O-DU) handover

This test scenario validates the user's voice experience when the end user device (UE) is on a VoNR call and performs a handover between two O-RUs that connect to the same O-DU (intra-O-DU handover).

6.5.2.1 Test Description

The 5G O-RAN system consists of multiple sub-components: the O-RU, the O-DU and the O-CU-CP/O-CU-UP. This leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user's voice experience during a handover between two O-RUs that are connected to the same O-DU (and the same O-CU-CP/O-CU-UP), hence an intra-O-DU handover. The handover is transparent to the 5G core, as it occurs entirely within the O-RAN system. This test assesses the impact of this handover scenario on the end user's voice service by monitoring the KPIs included in Section 6.5.

6.5.2.2 Test Setup

The SUT in this test case is a pair of O-RUs that connect to the same O-DU and O-CU-CP/O-CU-UP (refer to Section 4.4). This O-RAN setup is used to test the voice service during a handover. A 5G SA core is required to support the basic functionality to authenticate and register the end user devices and to establish PDU sessions. An IMS core is required to register the end user devices to support voice services on a 5G network. The 5G and IMS cores can be completely emulated, partially emulated or real (non-emulated). At least two end user devices (UEs) are needed, real or emulated, and both have to support voice service using VoNR. The end user devices serve as the Mobile Originated (MO) and Mobile Terminated (MT) devices forming the two ends of the voice call; in the remainder of this section they are referred to as the MO end user device and the MT end user device. The test setup should include tools that can collect traces on the elements and/or packet captures of the communication between the elements; this can be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The pair of O-RUs (O-RU1 and O-RU2) need to be connected to the O-DU and O-CU-CP/O-CU-UP, and all components need to have the right configuration and software load. The end user devices must be configured with the right user credentials to register and authenticate with the O-RAN system and the 5G core, to set up PDU sessions with the 5G core, and to register with the IMS core to perform voice calls using VoNR. The 5G core network and IMS core must be configured to support voice service on the end user devices used for testing, which includes the ability to dynamically set up QoS flows for voice calls.

All the elements in the network (O-RAN system, 5G core and IMS core) need to be able to capture traces to validate the successful execution of the test cases. The end user devices need to be able to capture traces/packet captures to calculate the VoNR KPIs. Optionally, the network can have network taps deployed in various legs of the network to obtain packet captures that validate successful execution of the test cases. Finally, all these components need connectivity with each other: the end user devices must be able to connect to the O-RAN system (O-RUs connected to the O-DU and O-CU-CP/O-CU-UP), and the O-RAN system needs to be connected to the 5G core, which in turn needs connectivity to the IMS core.

6.5.2.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 5G core and IMS core have all been configured as outlined in Section 6.5.2.2. In this test scenario, both the mobile originated and mobile terminated end user devices use the same O-RAN system, 5G core and IMS core to perform the end-to-end voice call. All traces and packet captures need to be enabled for the duration of the testing so that all communication between network elements can be captured and validated.

1. Power on the two end user devices and ensure both devices register with the 5G core for voice services over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP). Ensure both the MO and MT end user devices are in the coverage area of the same O-RU (O-RU1).
2. Once the registration is complete, the MO and MT end user devices have to establish PDU sessions with the 5G core. Once the PDU sessions have been set up, both end user devices have to register with the IMS core to support voice services.
3. Use the MO end user device to call the MT end user device. Validate that the MT end user device is able to receive and answer the call.
4. Once the call has been set up, move the MO end user device from the coverage area of O-RU1 to the coverage area of O-RU2, causing a handover from O-RU1 to O-RU2.
5. Continue the two-way voice communication between the MO and MT end user devices until the handover procedure is complete, then terminate the voice call (a sketch for quantifying the voice interruption around the handover follows this list).
6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results.
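One way to quantify the impact of step 4 is to measure the longest media gap around the handover instant. A minimal sketch, assuming the RTP arrival timestamps of one direction and the handover time (e.g. from the RRC trace) have already been extracted; the 5 s window is an illustrative choice.

```python
def voice_interruption_ms(arrival_times, handover_time, window_s=5.0):
    """Longest RTP inter-arrival gap (ms) around the handover instant.

    arrival_times: sorted packet arrival timestamps (seconds) for one
    direction of the call; handover_time: when the UE moved to O-RU2.
    """
    near = [t for t in arrival_times
            if handover_time - window_s <= t <= handover_time + window_s]
    gaps = [later - earlier for earlier, later in zip(near, near[1:])]
    return 1000.0 * max(gaps, default=0.0)

# 20 ms packet spacing with a ~60 ms hiccup at the handover (t = 10.0 s):
times = [9.90, 9.92, 9.94, 9.96, 9.98, 10.04, 10.06, 10.08]
print(voice_interruption_ms(times, handover_time=10.0))  # roughly 60 ms
```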

6.5.2.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate that the end user devices complete registration and PDU session setup without any errors over the O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate that both end user devices are able to register with the IMS core for voice services. This is a prerequisite before these tests can be validated.

Validate that the end user devices are able to make a voice call to each other, with a 5QI-1 QoS flow set up dynamically to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible for the duration of the call, without one-way audio or intermittent muting, especially during the handover process. Use the packet captures to validate that there are no RTP packet drops and no high RTP packet jitter, which could cause voice muting issues, and that there are no out-of-sequence packets, which could impact the customer's voice experience.

The average ranges of values for the KPIs are included below for guidance. These values apply when the testing is performed in a controlled environment in good radio conditions, without interference from external factors that could impact the KPIs; for example, using the internet to connect to remote servers/hosts can add latency, jitter and reliability issues to the connection. As there are multiple variables that can impact the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated. (Where no perceptual MOS tester is available, an estimation sketch follows the list.)

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
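A rough MOS estimate can be derived from the measured one-way delay and packet loss using the ITU-T G.107 E-model. The sketch below uses the standard R-to-MOS mapping and a simplified R computation with illustrative constants (Bpl = 25.1, roughly the ITU-T G.113 value for G.711 with packet-loss concealment); this is an approximation for sanity-checking results, not the measurement method of this specification.

```python
def mos_from_r(r):
    """ITU-T G.107 mapping from the E-model rating R to an estimated MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7.10e-6 * r * (r - 60.0) * (100.0 - r)

def estimate_mos(one_way_delay_ms, loss_pct, bpl=25.1):
    """Simplified E-model: R = 93.2 - Id - Ie_eff, other impairments ignored."""
    d = one_way_delay_ms
    delay_penalty = 0.024 * d + (0.11 * (d - 177.3) if d > 177.3 else 0.0)
    loss_penalty = 95.0 * loss_pct / (loss_pct + bpl)   # assumes codec Ie = 0
    return mos_from_r(93.2 - delay_penalty - loss_penalty)

# 80 ms one-way delay with 0.5 % loss stays comfortably above the 3.5 target:
print(round(estimate_mos(80.0, 0.5), 2))  # ~4.33
```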

As part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test report. The following information should also be included in the test report to provide a comprehensive view of the test setup.

UE side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

Table 6-18: Example Test Report for VoNR – Intra-O-DU Handover

KPI / Measurement | VoNR MO | VoNR MT
Call Setup Success Rate | |
Call Setup Time | |
MOS Score | |
Mute Rate % | |
One Way Call % | |
RTP Packet Loss % | |
L1 DL Spectral efficiency [bps/Hz] | |
UE RSRP [dBm] | |
UE PDSCH SINR [dB] | |
MIMO rank | |
PDSCH MCS | |
DL RB number | |
UE CQI | |
UE RSRQ | |
UE PMI | |
UE RSSI | |
UE Buffer status | |
UE Packet delay | |
PDSCH BLER [%] | |
The end user voice service experience can also be impacted by some of the features available on the O-RUs, O-DU and O-CU-CP/O-CU-UP (see Section 6.5). The details of these features, along with any other features or functionality that could impact the end user's voice service experience, should be included in the test report to provide a comprehensive view of the testing.


6.5.3 VoNR – Intra-Central Unit (O-CU) Inter-Distributed Unit (O-DU) handover

This test scenario validates the user's voice experience when the end user device (UE) is on a VoNR call and performs a handover between O-RUs that connect to different O-DUs, which in turn are connected to the same O-CU-CP/O-CU-UP (intra-O-CU inter-O-DU handover).

6.5.3.1 Test Description

The 5G O-RAN system consists of multiple sub-components: the O-RU, the O-DU and the O-CU-CP/O-CU-UP. This leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user's voice experience during a handover between two O-RUs, where the O-RUs are connected to different O-DUs which in turn are connected to the same O-CU-CP/O-CU-UP, hence an intra-O-CU inter-O-DU handover. The handover is transparent to the 5G core, as it occurs entirely within the O-RAN system. This test assesses the impact of this handover scenario on the end user's voice service by monitoring the KPIs included in Section 6.5.

6.5.3.2 Test Setup

The SUT in this test case is a pair of O-RUs, each connected to a different O-DU, with both O-DUs connected to the same O-CU (refer to Section 4.5). This O-RAN setup is used to test the voice service during a handover. A 5G SA core is required to support the basic functionality to authenticate and register the end user devices and to establish PDU sessions. An IMS core is required to register the end user devices to support voice services on a 5G network. The 5G and IMS cores can be completely emulated, partially emulated or real (non-emulated). At least two end user devices (UEs) are needed, real or emulated, and both have to support voice service using VoNR. The end user devices serve as the Mobile Originated (MO) and Mobile Terminated (MT) devices forming the two ends of the voice call; in the remainder of this section they are referred to as the MO end user device and the MT end user device. The test setup should include tools that can collect traces on the elements and/or packet captures of the communication between the elements; this can be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The pair of O-RUs (O-RU1 and O-RU2) need to be connected to a pair of O-DUs (O-DU1 and O-DU2) and to the O-CU-CP/O-CU-UP: O-RU1 connects to O-DU1, O-RU2 connects to O-DU2, and both O-DUs connect to the same O-CU-CP/O-CU-UP. All O-RAN components need to have the right configuration and software load. The end user devices need to be provisioned with the user credentials to register and set up PDU sessions with the 5G core and to register with the IMS core to perform voice calls using VoNR. The 5G core network and IMS core must be configured to support voice service on the end user devices used for testing, which includes the ability to dynamically set up QoS flows for voice calls.

All the elements in the network (O-RAN system, 5G core and IMS core) need to be able to capture traces to validate the successful execution of the test cases. The end user devices need to be able to capture traces/packet captures to calculate the VoNR KPIs. Optionally, the network can have network taps deployed in various legs of the network to obtain packet captures that validate successful execution of the test cases. Finally, all these components need connectivity with each other: the end user devices must be able to connect to the O-RAN system (O-RUs connected to the O-DUs and O-CU), and the O-RAN system needs to be connected to the 5G core, which in turn needs connectivity to the IMS core.


6.5.3.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 5G core and IMS core have all been configured as outlined in Section 6.5.3.2. In this test scenario, both the mobile originated and mobile terminated end user devices use the same O-RAN system, 5G core and IMS core to perform the end-to-end voice call. All traces and packet captures need to be enabled for the duration of the testing so that all communication between network elements can be captured and validated.

1. Power on the two end user devices and ensure both devices register with the 5G core for voice services over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU). Ensure both the MO and MT end user devices are in the coverage area of the same O-RU (O-RU1).
2. Once the registration is complete, the MO and MT end user devices have to establish PDU sessions with the 5G core. Once the PDU sessions have been set up, both end user devices have to register with the IMS core to support voice services.
3. Use the MO end user device to call the MT end user device. Validate that the MT end user device is able to receive and answer the call.
4. Once the call has been set up, move the MO end user device from the coverage area of O-RU1 to the coverage area of O-RU2, causing a handover from O-RU1 to O-RU2 and from O-DU1 to O-DU2.
5. Continue the two-way voice communication between the end user devices until the handover procedure is complete, then terminate the voice call.
6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results.

6.5.3.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate that the end user devices complete registration and PDU session setup without any errors over the O-RU, O-DU and O-CU. Also validate that both end user devices are able to register with the IMS core for voice services. This is a prerequisite before these tests can be validated.

Validate that the end user devices are able to make a voice call to each other, with a 5QI-1 QoS flow set up dynamically to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible for the duration of the call, without one-way audio or intermittent muting, especially during the handover process. Use the packet captures to validate that there are no RTP packet drops and no high RTP packet jitter, which could cause voice muting issues, and that there are no out-of-sequence packets, which could impact the customer's voice experience.

The average ranges of values for the KPIs are included below for guidance. These values apply when the testing is performed in a controlled environment in good radio conditions, without interference from external factors that could impact the KPIs; for example, using the internet to connect to remote servers/hosts can add latency, jitter and reliability issues to the connection. As there are multiple variables that can impact the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated. (A sketch extracting the Call Setup Time from the SIP signalling follows the list.)

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
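Call Setup Time is measured from the initial SIP INVITE to the first SIP 180 Ringing (this is also how CST is defined for the video KPIs in Section 6.6). A minimal sketch extracting it from a time-ordered SIP event list; the (timestamp, first line) input format is an assumption about the signalling-trace export.

```python
def call_setup_time_s(sip_events):
    """CST in seconds: first INVITE to first 180 Ringing, else None."""
    invite = None
    for ts, first_line in sip_events:
        if invite is None and first_line.startswith("INVITE"):
            invite = ts
        elif invite is not None and first_line.startswith("SIP/2.0 180"):
            return ts - invite
    return None   # call never reached ringing; count it against the CSSR

events = [(10.00, "INVITE sip:mt@example.com SIP/2.0"),
          (10.08, "SIP/2.0 100 Trying"),
          (11.25, "SIP/2.0 180 Ringing")]
assert call_setup_time_s(events) == 1.25
```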


As part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test report. The following information should also be included in the test report to provide a comprehensive view of the test setup.

UE side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

Table 6-19: Example Test Report for VoNR – Inter-O-DU Intra-O-CU Handover

KPI / Measurement | VoNR MO | VoNR MT
Call Setup Success Rate | |
Call Setup Time | |
MOS Score | |
Mute Rate % | |
One Way Call % | |
RTP Packet Loss % | |
L1 DL Spectral efficiency [bps/Hz] | |
UE RSRP [dBm] | |
UE PDSCH SINR [dB] | |
MIMO rank | |
PDSCH MCS | |
DL RB number | |
UE CQI | |
UE RSRQ | |
UE PMI | |
UE RSSI | |
UE Buffer status | |
UE Packet delay | |
PDSCH BLER [%] | |
The end user voice service experience can also be impacted by some of the features available on the O-RUs, O-DUs and O-CU (see Section 6.5). The details of these features, along with any other features or functionality that could impact the end user's voice service experience, should be included in the test report to provide a comprehensive view of the testing.

6.5.4 VoNR – Inter-Central Unit (O-CU) handover

This test scenario validates the user's voice experience when the end user device (UE) is on a VoNR call and performs a handover between O-RUs that connect to different O-DUs and different O-CUs (inter-O-CU inter-O-DU handover).

6.5.4.1 Test Description

The 5G O-RAN system consists of multiple sub-components: the O-RU, the O-DU and the O-CU. This leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user's voice experience during a handover between two O-RUs, where the O-RUs are connected to different O-DUs which in turn are connected to different O-CUs, hence an inter-O-CU inter-O-DU handover. The handover occurs within the O-RAN system, but the 5G core is aware of it, as the core needs to send data to a new O-CU as part of the handover process. This test assesses the impact of this handover scenario on the end user's voice service by monitoring the KPIs included in Section 6.5.

6.5.4.2 Test Setup

The SUT in this test case is a pair of O-RAN subsystems: a set of O-RU, O-DU and O-CU that interconnects with another set of O-RU, O-DU and O-CU (refer to Section 4.6). This O-RAN setup is used to test the voice service during a handover. A 5G SA core is required to support the basic functionality to authenticate and register the end user devices and to establish PDU sessions. An IMS core is required to register the end user devices to support voice services on a 5G network. The 5G and IMS cores can be completely emulated, partially emulated or real (non-emulated). At least two end user devices (UEs) are needed, real or emulated, and both have to support voice service using VoNR. The end user devices serve as the Mobile Originated (MO) and Mobile Terminated (MT) devices forming the two ends of the voice call; in the remainder of this section they are referred to as the MO end user device and the MT end user device. The test setup should include tools that can collect traces on the elements and/or packet captures of the communication between the elements; this can be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The two O-RAN subsystems (O-RAN1 and O-RAN2) are connected as follows: O-RU1 connects to O-DU1, which in turn connects to O-CU1; similarly, O-RU2 connects to O-DU2, which in turn connects to O-CU2. O-CU1 and O-CU2 are connected to each other and to the 5G core. All O-RAN components need to have the right configuration and software load. The end user devices must be configured with the right user credentials to register and authenticate with the O-RAN system and the 5G core, to set up PDU sessions with the 5G core, and to register with the IMS core to perform voice calls using VoNR. The 5G core network and IMS core must be configured to support voice service on the end user devices used for testing, which includes the ability to dynamically set up QoS flows for voice calls.

All the elements in the network (O-RAN system, 5G core and IMS core) need to be able to capture traces to validate the successful execution of the test cases. The end user devices need to be able to capture traces/packet captures to calculate the VoNR KPIs. Optionally, the network can have network taps deployed in various legs of the network to obtain packet captures that validate successful execution of the test cases. Finally, all these components need connectivity with each other: the end user devices must be able to connect to the O-RAN system (O-RUs connected to O-DUs, which are connected to the O-CUs), and the O-RAN system needs to be connected to the 5G core, which in turn needs connectivity to the IMS core.

6.5.4.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 5G core and IMS core have all been configured as outlined in Section 6.5.4.2. In this test scenario, both the mobile originated and mobile terminated end user devices use the same O-RAN system, 5G core and IMS core to perform the end-to-end voice call. All traces and packet captures need to be enabled for the duration of the testing so that all communication between network elements can be captured and validated.


1. Power on the two end user devices and ensure both devices register with the 5G core for voice services over SA by connecting over the O-RAN1 subsystem (O-RU1, O-DU1 and O-CU1). Ensure both the MO and MT end user devices are in the coverage area of the same O-RU (O-RU1).
2. Once the registration is complete, the MO and MT end user devices have to establish PDU sessions with the 5G core. Once the PDU sessions have been set up, both end user devices have to register with the IMS core to support voice services.
3. Use the MO end user device to call the MT end user device. Validate that the MT end user device is able to receive and answer the call.
4. Once the call has been set up, move the MO end user device from the coverage area of O-RU1 to the coverage area of O-RU2, causing a handover from the O-RAN1 subsystem to the O-RAN2 subsystem: O-RU1 to O-RU2, O-DU1 to O-DU2 and O-CU1 to O-CU2.
5. Continue the two-way voice communication between the MO and MT end user devices until the handover procedure is complete, then terminate the voice call.
6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results.

6.5.4.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate that the end user devices complete registration and PDU session setup without any errors over the O-RU, O-DU and O-CU. Also validate that both end user devices are able to register with the IMS core for voice services. This is a prerequisite before these tests can be validated.

Validate that the end user devices are able to make a voice call to each other, with a 5QI-1 QoS flow set up dynamically to transfer voice packets over RTP. Ensure the Call Setup Time is reasonable and the voice quality for both parties is clear and audible for the duration of the call, without one-way audio or intermittent muting, especially during the handover process. Use the packet captures to validate that there are no RTP packet drops and no high RTP packet jitter, which could cause voice muting issues, and that there are no out-of-sequence packets, which could impact the customer's voice experience.

The average ranges of values for the KPIs are included below for guidance. These values apply when the testing is performed in a controlled environment in good radio conditions, without interference from external factors that could impact the KPIs; for example, using the internet to connect to remote servers/hosts can add latency, jitter and reliability issues to the connection. As there are multiple variables that can impact the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%

As part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test report. The following information should also be included in the test report to provide a comprehensive view of the test setup.

UE side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

Table 6-20: Example Test Report for VoNR – Inter-O-CU Handover

KPI / Measurement | VoNR MO | VoNR MT
Call Setup Success Rate | |
Call Setup Time | |
MOS Score | |
Mute Rate % | |
One Way Call % | |
RTP Packet Loss % | |
L1 DL Spectral efficiency [bps/Hz] | |
UE RSRP [dBm] | |
UE PDSCH SINR [dB] | |
MIMO rank | |
PDSCH MCS | |
DL RB number | |
UE CQI | |
UE RSRQ | |
UE PMI | |
UE RSSI | |
UE Buffer status | |
UE Packet delay | |
PDSCH BLER [%] | |
The end user voice service experience can also be impacted by some of the features available on the O-RUs, O-DUs and O-CUs (see Section 6.5). The details of these features, along with any other features or functionality that could impact the end user's voice service experience, should be included in the test report to provide a comprehensive view of the testing.

6.6 Video Service – Video over LTE (ViLTE)

Voice service has been one of the basic services provided on mobile devices since launch. With increasing LTE penetration and the availability of higher speeds, video calls are slowly replacing voice calls. With the launch of 5G, which promises much higher throughput, the shift towards video calls is only going to accelerate. This section of the document validates the user's video calling experience in different conditions on the LTE network. It applies only to the video calling service provided by the telecom service provider, which uses the operator's IMS core to establish a dedicated bearer that provides a superior video calling experience. The KPIs that will be monitored to assess the video calling service are included below (a sketch that classifies calls as muted or one-way from these definitions follows the list).

• CSSR – Call Setup Success Rate % – the number of successful calls divided by the total number of calls made, as a percentage.
• CST – Call Setup Time – the time from the initial SIP INVITE to when the SIP 180 Ringing is received, in seconds.
• MOS Score – Mean Opinion Score for the video call. This needs to be measured on both ends of the video call (mobile originated and mobile terminated).
• Mute Rate % – the percentage of video calls in which the audio was muted or the video froze in both directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on both ends of the video call (mobile originated and mobile terminated) and counted only once per call.
• One Way Call % – the percentage of video calls in which the audio was muted or the video was not transmitted in one direction only (calls with RTP loss of > 3-4 s in one direction only are considered one-way calls). This needs to be monitored on both ends of the video call (mobile originated and mobile terminated) and counted only once per call.
• RTP Packet Loss % – the number of RTP packets dropped/lost in the uplink/downlink direction as a percentage of total packets. This needs to be measured on both ends of the video call (mobile originated and mobile terminated).
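A minimal sketch of the muted/one-way classification defined above, assuming the longest RTP outage per direction has already been measured for each call; the 3.0 s default sits at the low end of the 3-4 s range given in the definitions.

```python
def classify_call(max_gap_mo_s, max_gap_mt_s, threshold_s=3.0):
    """Classify one call from the longest RTP outage in each direction."""
    mo_out = max_gap_mo_s > threshold_s
    mt_out = max_gap_mt_s > threshold_s
    if mo_out and mt_out:
        return "muted"          # loss in both directions
    if mo_out or mt_out:
        return "one-way"        # loss in exactly one direction
    return "ok"

def mute_and_one_way_rates(calls):
    """calls: iterable of (max_gap_mo_s, max_gap_mt_s) per completed call;
    each call is counted only once, as the definitions require."""
    labels = [classify_call(mo, mt) for mo, mt in calls]
    n = max(len(labels), 1)
    return (100.0 * labels.count("muted") / n,     # Mute Rate %
            100.0 * labels.count("one-way") / n)   # One Way Call %

# One clean call, one muted call, one one-way call:
print(mute_and_one_way_rates([(0.1, 0.2), (4.2, 3.8), (5.0, 0.3)]))
```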

Along with the monitoring and validation of these services using user experience KPIs, the O-RAN system also needs to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN system; some of these features are listed below. As part of the video calling service testing, details of these features need to be included in the test report to give a comprehensive view of the setup used for testing. If additional features or functionalities that impact the end user video experience have been enabled during this testing, those need to be included in the test report as well.

• RLC in Unacknowledged Mode
• Robust Header Compression
• DRX
• Dynamic GBR Admission Control
• TTI Bundling
• VoLTE Inactivity Timer
• Frequency Hopping
• Multi-Target RRC Connection Re-Establishment
• VoLTE HARQ
• Coordinated Multi Point (DL and UL)
• VoLTE Quality Enhancement
• Packet Loss Detection
• Voice Codec Aware Scheduler
• NR to LTE PS Redirection/Cell Reselection/Handover
• LTE to NR PS Redirection/Cell Reselection/Handover

The table below summarizes the test cases and the applicable technology.
Table 6-21: ViLTE Test Case summary

Test ID | Test case | Functional group | LTE | NSA | SA
Video service – Video over LTE (ViLTE):
6.6.1 | ViLTE Stationary Test | Service | Y | N/A | N/A
6.6.2 | ViLTE handover Test (intra) | Service | Y | N/A | N/A
6.6.3 | Video Service – LTE to NR / NR to LTE handover test | Service | Y | N/A | Y

6.6.1 ViLTE Stationary Test

This test scenario validates the user's video calling experience when the end user device (UE) is in LTE coverage and performs a video call.

6.6.1.1 Test Description

Telecom service providers offer video calling service to their customers, and it has been gaining popularity, replacing voice calls. Video over LTE (ViLTE) uses the same mechanism as VoLTE, i.e. IP packets to send and receive video and audio, with the IP packets transferred over the LTE radio. As with VoLTE, IMS is used to set up the control and data planes needed for ViLTE communication. As two-way video service is latency sensitive, the EPC interacts with the IMS core to set up different bearers for video traffic: QCI-5 for the ViLTE control plane and QCI-2 for the ViLTE data plane. This test case is applicable when the UE is connected over an NSA network. This section tests the ViLTE user experience on an O-RAN system.

6.6.1.2 Test Setup

The SUT in this test case is an O-eNB acting as the Master eNB, plus a Secondary gNB. As most current NSA deployments use the 4G eNB to provide video calling services, the use of a Secondary gNB is optional; it is included in this test scenario but is only applicable if the gNB plays a role in establishing the control plane or data plane for a video call. The O-eNB, the gNB and the components within them have to comply with the O-RAN specifications. The O-RAN setup should support the ability to perform this testing in the different radio conditions defined in Section 3.6. A 4G core is required to support the basic functionality to authenticate and register an end user device in order to set up a PDN connection. An IMS core is required to register the end user devices to support video calling services on a 4G network. The 4G and IMS cores can be completely emulated, partially emulated or real (non-emulated). At least two end user devices (UEs) are needed, real or emulated, and both have to support video calling service using ViLTE. The end user devices serve as the Mobile Originated (MO) and Mobile Terminated (MT) devices forming the two ends of the video call; in the remainder of this section they are referred to as the MO end user device and the MT end user device. The test setup should include tools that can collect traces on the elements and/or packet captures of the communication between the elements; this can be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The Master eNB, the Secondary gNB and their components (O-eNB, O-RU, O-DU and O-CU) need to have the right configuration and software load. The end user devices must be configured with the right user credentials to register and authenticate with the O-RAN system and the 4G core, to attach to the 4G core, and to register with the IMS core to perform video calls using ViLTE. The 4G core network and IMS core must be configured to support the end user devices used for testing; this includes supporting registration, authentication and PDN connection establishment for these devices, as well as provisioning the IMS core so that the end user devices can register and make video calls over ViLTE. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.


All the elements in the network (O-RAN system, 4G core and IMS core) need to be able to capture traces to validate the successful execution of the test cases. The end user devices need to be able to capture traces/packet captures to calculate the ViLTE KPIs. Optionally, the network can have network taps deployed in various legs of the network to obtain packet captures that validate successful execution of the test cases. Finally, all these components need connectivity with each other: the end user devices must be able to connect to the O-RAN system (O-eNBs and gNBs), and the O-RAN system needs to be connected to the 4G core, which in turn needs connectivity to the IMS core.

6.6.1.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 4G core and IMS core have all been configured as outlined in Section 6.6.1.2. In this test scenario, both the mobile originated and mobile terminated end user devices use the same O-RAN system (i.e. the same O-eNB), 4G core and IMS core to perform the end-to-end video call. All traces and packet captures need to be enabled for the duration of the testing so that all communication between network elements can be captured and validated.

1. Power on the two end user devices in excellent radio conditions and ensure both end user devices connect to the 4G core over the Master O-eNB and, optionally, the Secondary gNB.
2. Ensure both the MO and MT end user devices can establish a PDN connection with the 4G core. Once the PDN connection has been set up, both end user devices have to register with the IMS core to support video calling services.
3. Use the MO end user device to video call the MT end user device. Validate that the MT end user device is able to receive and answer the call.
4. Continue two-way voice and video communication on the video call for at least 5 minutes before terminating it.
5. Repeat the test multiple times (> 10 times) and gather results.
6. Repeat steps 1 through 5 with the MO and MT end user devices in good, fair and poor radio conditions.

6.6.1.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful PDN connection setup by the end user devices, without any errors, using the Master eNB and optionally the Secondary gNB. Also validate that both end user devices are able to register with the IMS core for video calling services. This is a prerequisite before these tests can be validated.

Validate that the end user devices are able to make a video call to each other, with a QCI-2 bearer set up dynamically to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and the video and voice quality for both parties is clear and audible, without one-way media, intermittent muting or video freezing. Use the packet captures to validate that there are no RTP packet drops and no high RTP packet jitter, which could cause voice muting, video freezing or video lag issues, and that there are no out-of-sequence packets, which could impact the customer's video calling experience.

The average ranges of values for the KPIs are included below for guidance. These values apply when the testing is performed in a controlled environment in good radio conditions, without interference from external factors that could impact the KPIs; for example, using the internet to connect to remote network nodes can add latency, jitter and packet loss issues to the connection. As there are multiple variables that can impact the testing in this scenario, a KPI outcome outside the range does not necessarily indicate a failed test case. Any test result outside the KPI range has to be investigated to identify the root cause, as the issue may be due to variables outside the SUT. If the root cause is not in the SUT, the issue has to be resolved and the test repeated.

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%

The end user video calling KPI values defined in Section 6.6 need to be included in the test report along with the minimum configuration parameters included in Section 3.3. The following information should also be included in the test report for the testing performed in the different radio conditions, to provide a comprehensive view of the test setup.

End user device side (real or emulated UE):

• Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)
• PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)
• Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)
• Downlink transmission mode
• Channel utilization, i.e. number of allocated/occupied downlink PRBs and number of allocated/occupied slots (average sample per second)

The table below gives an example of the test report, reporting the mean and standard deviation of all captured test results (a sketch of this aggregation follows the table).

Table 6-22: Example Test Report for Video over LTE Testing – Stationary Test

KPI / Measurement | Excellent (cell centre), ViLTE MO/MT | Good, ViLTE MO/MT | Fair, ViLTE MO/MT | Poor (cell edge), ViLTE MO/MT
Call Setup Success Rate | | | |
Call Setup Time | | | |
MOS Score | | | |
Mute Rate | | | |
One Way Call | | | |
RTP Packet Loss | | | |
L1 DL Spectral efficiency [bps/Hz] | | | |
UE RSRP [dBm] | | | |
UE PDSCH SINR [dB] | | | |
MIMO rank | | | |
PDSCH MCS | | | |
DL RB number | | | |
UE CQI | | | |
UE RSRQ | | | |
UE PMI | | | |
UE RSSI | | | |
UE Buffer status | | | |
UE Packet delay | | | |
PDSCH BLER [%] | | | |
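A minimal sketch of the mean-and-standard-deviation aggregation used to fill one cell of the table above, using only the Python standard library; the sample values are illustrative.

```python
import statistics

def summarize(per_run_values):
    """Mean ± sample standard deviation for one KPI across repeated runs."""
    mean = statistics.mean(per_run_values)
    spread = statistics.stdev(per_run_values) if len(per_run_values) > 1 else 0.0
    return f"{mean:.2f} ± {spread:.2f}"

# e.g. UE RSRP readings from the repeated runs in one radio condition:
print(summarize([-85.1, -84.7, -86.0, -85.4]))  # -85.30 ± 0.55
```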


The end user ViLTE experience can also be impacted by some of the features available on the O-eNB, O-RU, O-DU and O-CU (see Section 6.3). The details of these features, along with any other features or functionality that could impact the end user's video calling service experience, should be included in the test report to provide a comprehensive view of the testing.

6.6.2 ViLTE Handover Test

This test case is for further study (FFS).

7 6.6.3 ViLTE - LTE to NR handover test

8 This test scenario validates the user’s video calling experience when the UE is in video call on LTE and performs a
9 handover from LTE network to a 5G network and vice versa. This test scenario is applicable for a 5G SA deployment.

10 6.6.3.1 Test Description

11 Voice service is one of the basic services provided on the telecommunication network. Video service on the 4G network
12 was provided using ViLTE. Similarly, video service on the 5G network is provided using packet switch technology
13 called Video over New Radio. As 5G network is being deployed by a telecommunication service provider, the service
14 provider will need to support 4G and 5G network, and thus support Video over LTE, Video over NR and the handover
15 between the two video calling services. This scenario tests the end user’s video calling experience when the end user
16 device performs a handover from video over LTE to Video over NR and vice versa.

17 6.6.3.2 Test Setup

18 The SUT in this test case would be an O-eNB along with a gNB (O-RU, O-DU and O-CU). We would also need an
19 interworking 4G-5G core (referred to as 4G-5G core going forward) which supports a combined anchor point for 4G
20 and 5G, i.e. SMF+PGW-C and UPF+PGW-U. The eNB connects to a 4G-5G core over the 4G interfaces like S1 to
21 provide 4G LTE service and the O-CU, O-DU and O-RU would connect to a 4G-5G core over the 5G interfaces like N2
22 and N3 to provide 5G SA service. The O-eNB, gNB and the components within these have to comply with the O-RAN
23 specifications. The 4G-5G core will be required to support the basic functionality to authenticate and register an end
24 user device in order to setup a PDN connection/PDU session. An IMS core will be required to register the end user
25 device to support video calling services on a 4G and 5G network. The 4G and 5G core have to interwork using the N26
26 interface between the MME and AMF to support seamless handover. We are recommending the use of N26 interface to
27 ensure better customer experience when the end user device performs a handover between 4G and 5G. The 4G-5G and
28 IMS cores could be a completely emulated, partially emulated or real non-emulated cores. We will need at least two end
29 user devices (UE) which can be a real UEs or an emulated one, and both have to support video calling service using
30 ViLTE, Video over NR and the capability to handover from ViLTE to Video over NR and vice versa. The end user
31 devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user devices forming the two ends of
32 the video call. Going forward in this section, these end user devices will be referred to as MO end user device and MT
33 end user devices to represent their role in the video call. The test setup should include tools which have the ability to
34 collect traces on the elements and/or packet captures of communication between the elements. This could be a built-in
35 capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network
36 elements are located remotely either in a cloud or on the internet, the additional latency should be calculated and
37 accounted for.

38 The O-eNB, gNB and their components (O-RU, O-DU and O-CU) need to have the right configuration and software
39 load. The end user devices must be configured with the right user credentials to be able to register and authenticate with
40 the O-RAN system and the 4G-5G core. The end user devices also need to be provisioned with the user credentials to

_______________________________________________________________________________________________.
© 2022 by the O-RAN ALLIANCE e.V. Your use is subject to the copyright statement on the cover page of this specification. 153
O-RAN.TIFG.E2E-Test.0-v04.00

1 support attach procedure to the 4G-5G core, registering to the IMS core and performing a video call over LTE and NR.
2 This includes supporting registration, authentication and PDN connection/PDU Session establishment for these end user
3 devices. This also includes provisioning the IMS core to support registration of the end user devices to make video calls
4 over ViLTE and Video over NR, which includes dynamically setting up dedicated bearers/QoS Flows for video calls.

5 All the elements in the network like O-RAN system, 4G-5G core and the IMS Core need to have the ability to capture
6 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
7 traces/packet capture to calculate the ViLTE and Video over NR KPIs. Optionally, the network could have network taps
8 deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally,
9 all these different components need to have connectivity with each other – the end user device should be able to connect
10 to the O-RAN system (O-eNBs and gNBs), the O-RAN system needs to be connected to the 4G-5G core, which in turn should
11 have connectivity to the IMS Core.

12 6.6.3.3 Test Methodology/Procedure

13 Ensure the end user devices, O-RAN system, 4G-5G core and the IMS Core have all been configured as outlined in
14 Section 6.6.3.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use
15 the O-eNB and gNB to connect to the same 4G-5G and IMS core to perform the end-to-end video call. All traces and
16 packet captures need to be enabled for the duration of the testing to ensure all communication between network
17 elements can be captured and validated.

18 The steps below describe how to perform a Video over LTE to Video over NR handover, followed by a Video over NR
19 to Video over LTE handover.

20 1. Power on the two end user devices and ensure both of the end user devices can connect to the 4G-5G core over the
21 O-eNB.
22 2. Ensure both the MO and MT end user devices can establish a PDN connection with the 4G-5G core. Once the PDN
23 connection has been established, both the end user devices have to register with the IMS core to support video
24 calling services.
25 3. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to
26 receive and answer the video call.
27 4. Once the two-way call has been setup and communication is going back and forth between the two end user
28 devices, move the MO device to the O-RU coverage of the gNB, thus forcing a handover from 4G to 5G, hence
29 forcing a ViLTE to Video over NR handover. Continue two-way voice and video communication through the
30 handover process and terminate the call once the handover process is complete. Measure the video KPIs included in
31 Section 6.6.
32 5. At this point in time, the MO end user device is in 5G coverage and registered to the 4G-5G core. The MT end user
33 device is in 4G coverage and registered to the 4G-5G core. Both end user devices should still be registered to the
34 IMS core.
35 6. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to
36 receive and answer the call.
37 7. Once the two-way call has been setup and communication is going back and forth between the two end user
38 devices, move the MO device to the O-eNB coverage, thus forcing a handover from 5G to 4G, hence forcing a
39 Video over NR to ViLTE handover. Continue the two-way voice and video communication through the handover
40 process and terminate the call once the handover process is complete. Measure the video KPIs included in Section
41 6.6.
42 8. Repeat the test multiple times (> 10 times) and gather results; a sketch of an automated test loop follows these steps.
43
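Where the end user devices and the RF environment can be automated, the repeated runs called for in step 8 lend themselves to scripting. The sketch below is purely illustrative: every object and method name (video_call, answer, wait_handover_complete, terminate_and_collect_kpis, trigger_handover) is a hypothetical placeholder for whatever interface the actual UE controllers and attenuation/mobility rigs expose, and is not defined by this specification.

```python
# Hypothetical test harness (illustrative only): drives repeated MO -> MT
# video calls with a forced ViLTE <-> Video over NR handover and aggregates
# the per-run KPIs defined in Section 6.6 for the test report.
import statistics

def run_handover_call(mo_ue, mt_ue, trigger_handover):
    """One iteration of steps 3-4; all callables are assumed automation hooks."""
    call = mo_ue.video_call(mt_ue.msisdn)       # step 3: MO calls MT
    mt_ue.answer(call)                          # step 3: MT answers
    trigger_handover()                          # step 4: move MO into gNB coverage
    call.wait_handover_complete()               # keep media running through HO
    return call.terminate_and_collect_kpis()    # dict: CST, MOS, mute rate, ...

def run_campaign(mo_ue, mt_ue, trigger_handover, iterations=10):
    results = [run_handover_call(mo_ue, mt_ue, trigger_handover)
               for _ in range(iterations)]      # step 8: repeat > 10 times
    # Mean and standard deviation per KPI, as used in the example test report.
    return {kpi: (statistics.mean([r[kpi] for r in results]),
                  statistics.stdev([r[kpi] for r in results]))
            for kpi in results[0]}
```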


1 6.6.3.4 Test Expectation (expected results)

2 As a pre-validation, use the traces to validate a successful registration and PDN connection setup by the end user
3 devices without any errors using the O-eNB when in 4G coverage. Similarly, use traces to validate successful
4 registration and PDU session setup by the end user device without any errors using O-RU, O-DU and O-CU when in 5G
5 coverage. Also validate both the end user devices are able to register with the IMS core for video calling services. This is a
6 prerequisite before these tests can be validated.

7 Validate the end user devices are able to make a video call between each other by dynamically setting up a QCI-2/5QI-2
8 bearer to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and that the video and voice
9 quality from both parties is clear and audible without one-way or intermittent muting or video freezing. Ensure the
10 voice and video quality is not impacted during the handover process – ViLTE to Video over NR and Video over NR to
11 ViLTE. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter which could cause
12 voice muting, video freezing or video lag issues. Use the packet captures to ensure there are no out-of-sequence packets
13 which could impact the customer’s video calling experience.
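The RTP packet loss and out-of-sequence checks described above can be derived directly from the RTP sequence numbers in the packet captures. The following Python sketch is illustrative only (the helper name and input format are assumptions, not part of this specification); it follows the expected-versus-received counting approach of RFC 3550:

```python
# Illustrative helper (not part of this specification): derive RTP loss and
# reordering statistics from the sequence numbers seen in one direction of a
# captured media stream, using the RFC 3550 expected-vs-received approach.

def rtp_stream_stats(seq_numbers):
    """seq_numbers: 16-bit RTP sequence numbers in order of arrival."""
    if not seq_numbers:
        return {"expected": 0, "received": 0, "lost": 0, "reordered": 0}
    MOD = 1 << 16                       # RTP sequence numbers wrap at 2^16
    highest = seq_numbers[0]            # extended highest sequence number seen
    reordered = 0
    for cur in seq_numbers[1:]:
        delta = (cur - highest) % MOD
        if 0 < delta <= MOD // 2:
            highest += delta            # packet advances the stream
        else:
            reordered += 1              # duplicate or late (out-of-sequence)
    expected = highest - seq_numbers[0] + 1   # RFC 3550: highest - first + 1
    received = len(seq_numbers)
    return {"expected": expected, "received": received,
            "lost": max(expected - received, 0), "reordered": reordered}

# Example: packet 12 never arrives (lost), packet 14 arrives late (reordered).
stats = rtp_stream_stats([10, 11, 13, 15, 14, 16])
print(stats, "loss %:", 100.0 * stats["lost"] / stats["expected"])
```

The loss percentage computed this way feeds the RTP Packet Loss % KPI, while a non-zero reordered count flags the out-of-sequence condition mentioned above.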

14 The average range of values for the KPIs is included below for guidance. These values are applicable when the testing
15 is performed in a controlled environment in good radio conditions without the interference of external factors which
16 could impact the KPIs; for example, use of the internet to connect to remote network nodes could add latency, jitter and packet
17 loss issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in
18 this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside
19 the KPI range encountered during testing will have to be investigated to identify the root cause, as the issues may be
20 due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test
21 has to be repeated.

22 • CSSR – Call Setup Success Rate % – > 99%
23 • CST – Call Setup Time – < 2.5 s
24 • MOS Score – > 3.5
25 • Mute Rate % – < 1%
26 • One Way Call % – < 1%
27 • RTP Packet Loss % – < 1%
28
29 The end user video calling KPI values defined in Section 6.6 need to be included in the test report along with the
30 minimum configuration parameters listed in Section 3.3. The following information should also be included in the
31 test report for the testing performed in different radio conditions to provide a comprehensive view of the test setup.

32 End user device side (real or emulated UE):

33 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

34 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

35 • Downlink transmission mode

36 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
37 (average sample per second)

38 The table below gives an example of the test report considering the mean and standard deviation of all test results that
39 have been captured.

40 Table 6-23 Example Test Report for Video over LTE Testing – Stationary Test

Excellent (cell centre)    Good           Fair           Poor (cell edge)
ViLTE MO/MT                ViLTE MO/MT    ViLTE MO/MT    ViLTE MO/MT


Call Setup Success Rate


Call Setup time
MOS Score
Mute Rate
One Way Call
RTP Packet Loss
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
1 The end user ViLTE experience can also be impacted by some of the features available (see Section 6.3) on the O-eNB,
2 O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which
3 could impact the end user’s video calling service experience should be included in the test report to provide a
4 comprehensive view of the testing.

5 6.7 Video Service – EPS Fallback


6 Video service is quickly replacing voice service and turning into one of the basic services offered by telecommunication
7 service providers. Upgrades of telecom networks occur in phases, and this is no different for 5G. The telecom network
8 may not be able to support video calling service on 5G during this phase for multiple reasons – the 5G network may not
9 be deployed nationwide, the 5G network may not be tuned to support voice/video service, or the devices may not be able
10 to support voice/video service on 5G. EPS fallback is the method used to support voice/video services during this
11 transition phase, where voice/video services continue to be supported on the legacy LTE network using Video over
12 LTE (ViLTE). This section of the document only applies to the video calling service provided by the telecom service
13 provider, which uses the telecom operator’s IMS core to establish dedicated bearers to provide a superior video calling
14 experience. The KPIs which will be monitored to assess the video calling service are defined below; an illustrative
snippet for deriving the Call Setup Time follows the list.

15 • CSSR – Call Setup Success Rate % – Total number of successful calls divided by the total number of
16 calls made, as a percentage.
17 • CST – Call Setup Time – Time taken from the initial SIP INVITE to when the SIP 180 Ringing is received
18 in seconds.
19 • MOS Score – Mean Opinion Score for the video call. This needs to be measured on both ends of the video
20 call – mobile originated and mobile terminated.
21 • Mute Rate % – This is the percentage of video calls in which the audio was muted or the video froze in both
22 directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on
23 both ends of the video call – mobile originated and mobile terminated – and counted only once per call.


1 • One Way Calls % – This is the percentage of video calls in which the audio was muted or the video was not
2 transmitted in one direction only (video calls with RTP loss of > 3-4 s in one direction only are considered one-way
3 calls). This needs to be monitored on both ends of the video call – mobile originated and mobile terminated –
4 and counted only once per call.
5 • RTP Packet Loss % – Number of RTP packets which were dropped/lost in the uplink/downlink direction as a
6 percentage of total packets. This needs to be measured on both ends of the video call – mobile originated
7 and mobile terminated.
8
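As a hedged illustration of the CST definition above (time from the initial SIP INVITE to the SIP 180 Ringing), the following Python sketch computes the CST per call from timestamped SIP events extracted from a trace or packet capture; the event format is an assumption made for this example, not something this specification mandates.

```python
# Illustrative only: compute Call Setup Time (CST) per call from timestamped
# SIP events. Each event is (timestamp_seconds, sip_message, call_id).

def call_setup_times(sip_events):
    invites, cst = {}, {}
    for ts, msg, call_id in sorted(sip_events):
        if msg == "INVITE" and call_id not in invites:
            invites[call_id] = ts               # first INVITE of this call
        elif msg == "180 Ringing" and call_id in invites and call_id not in cst:
            cst[call_id] = ts - invites[call_id]
    return cst                                   # seconds, keyed by Call-ID

events = [(10.000, "INVITE", "call-1"),
          (10.450, "100 Trying", "call-1"),
          (12.100, "180 Ringing", "call-1")]
print(call_setup_times(events))                  # {'call-1': ~2.1} -> CST 2.1 s
```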

9 Along with the monitoring and validation of these services using user experience KPIs, the O-RAN systems also need
10 to be monitored. The end user service experience can also be impacted by some of the features available on the O-RAN
11 system. Some of these features have been included below. As a part of the video calling services testing, details of these
12 features need to be included in the test report to get a comprehensive view of the setup used for testing. If additional
13 features or functionalities that impact the end user video calling experience have been enabled during this testing, those
14 need to be included in the test report as well.

15 • EPS Fallback for IMS Voice


16 • NR to EPS Mobility
17 • RLC in Unacknowledged Mode
18 • Robust Header Compression
19 • DRX
20 • Dynamic GBR Admission Control
21 • TTI Bundling
22 • VoLTE Inactivity Timer
23 • Frequency Hopping
24 • Multi-Target RRC Connection Re-Establishment
25 • VoLTE HARQ
26 • Coordinated Multi Point (DL and UL)
27 • VoLTE Quality Enhancement
28 • Packet Loss Detection
29 • Voice Codec Aware scheduler
30 Please see the summary table of test cases and applicable technologies below.
31
32 Table 6-24: Video EPS Fallback Test Case summary

Test ID    Test case            Functional group    Applicable technology
                                                    LTE     NSA     SA
6.7.1      EPS Fallback Test    Service             Y       N/A     Y
33
34


1 6.7.1 Video Service – EPS Fallback testing

2 This scenario tests the video calling service when an end user device (UE) is in 5G SA coverage and performs EPS
3 fallback to 4G to make or receive a video call.

4 6.7.1.1 Test Description

5 This section tests the video service on a 5G SA network when the EPS fallback mechanism is used to fall back to the LTE
6 network and support the video calling service using Video over LTE. This testing only applies to 5G SA deployments; in
7 this scenario the UE is connected to the 5G SA core and registered with the IMS core for video calling service. As
8 video calling service is not yet supported on 5G in this scenario, the UE is forced to fall back to 4G when it makes/receives a video call.
9 EPS fallback does increase the Call Setup Time due to the time needed to fall back before making/receiving the
10 video call.

11 There are two mechanisms by which EPS fallback is supported on the 5G core, and the mechanism used impacts the
12 time taken to perform the fallback to LTE, and hence the Call Setup Time. The two mechanisms that could be
13 used to perform EPS fallback are described below, followed by a sketch of the expected behaviour in each case. These
14 two mechanisms change the test setup but do not change the testing procedure.

15 • With N26 interface – The AMF in the 5G core communicates with the MME in the 4G core over the N26 interface.
16 In this scenario the UE performs a handover from the 5G network to the 4G network.
17 • Without N26 interface – The AMF in the 5G core does not communicate directly with the 4G core, instead
18 uses the UDM/HSS to store and transfer relevant session information to the 4G core. In this scenario the UE
19 performs a Release with Redirect from the 5G network to the 4G network.
20
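The difference between the two mechanisms, as visible to the UE and the test tooling, can be summarised as follows. This is a minimal, non-normative Python sketch (the dictionary layout is an illustration; the normative procedures are specified in 3GPP TS 23.502):

```python
# Non-normative summary of the two EPS fallback variants described above.
EPS_FALLBACK_VARIANTS = {
    "with_n26": {
        "core_interworking": "AMF <-> MME over the N26 interface",
        "ue_procedure": "inter-system handover from NR to E-UTRAN",
        "relative_cst": "lower (UE context is transferred directly)",
    },
    "without_n26": {
        "core_interworking": "session context transferred indirectly via UDM/HSS",
        "ue_procedure": "release with redirection from NR to E-UTRAN",
        "relative_cst": "higher (UE re-establishes the connection after redirect)",
    },
}

def expected_ue_procedure(n26_available: bool) -> str:
    key = "with_n26" if n26_available else "without_n26"
    return EPS_FALLBACK_VARIANTS[key]["ue_procedure"]

print(expected_ue_procedure(True))    # inter-system handover from NR to E-UTRAN
print(expected_ue_procedure(False))   # release with redirection from NR to E-UTRAN
```

During trace analysis, the observed UE procedure can be compared against the output of expected_ue_procedure() for the core configuration under test.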

21 6.7.1.2 Test Setup

22 The SUT in this test case would be an O-eNB along with a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP). We would
23 also need an interworking 4G-5G core (referred to as 4G-5G core going forward) which supports a combined anchor
24 point, i.e. SMF+PGW-C and UPF+PGW-U. The O-eNB connects to a 4G-5G core over the 4G interfaces like S1 to
25 provide 4G LTE service and the gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) would connect to the 4G-5G core over
26 the 5G interfaces like N2 and N3 to provide 5G SA service. The O-eNB, gNB and the components within these have to
27 comply with the O-RAN specifications and support EPS fallback. The 4G-5G core will be required to support the basic
28 functionality to authenticate and register an end user device in order to setup a PDN connection/PDU session. The IMS
29 core will be required to register the end user device and needs to be integrated with the 4G-5G core to support video
30 calling services over LTE (ViLTE) and EPS fallback. The 4G-5G core has to interwork either using the N26 interface
31 or without using N26 interface depending on the desired core network configuration. The existence of the N26 interface
32 does reduce the EPS fallback time, hence reducing the Call Setup Time and providing a better end user experience. The 4G-5G
33 and IMS cores could be completely emulated, partially emulated or real (non-emulated) cores. At least two
34 end user devices (UEs), real or emulated, are needed, and both have to support video calling service using
35 ViLTE and the EPS fallback procedure. The end user devices will serve as Mobile Originated (MO) and Mobile
36 Terminated (MT) end user devices forming the two ends of the video call. For the sake of clarity, we will address these
37 end user devices as UE-1 and UE-2 in this section. The test setup should include tools which have the ability to collect
38 traces on the elements and/or packet captures of communication between the elements. This could be a built-in
39 capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network
40 elements are located remotely either in a cloud or on the internet, the additional latency should be calculated and
41 accounted for.

42 The O-eNB, gNB and their components (O-RU, O-DU and O-CU-CP/O-CU-UP) need to have the right configuration
43 and software load. The end user device must be configured with the right user credentials to be able to register and


1 authenticate with the O-RAN system and the 4G-5G core. The end user devices also need to be provisioned with the
2 user credentials to attach to the 4G-5G core and register with the IMS core to perform video calls using ViLTE and EPS
3 fallback. The 4G-5G core network and IMS core must be configured to support video calling service on the end user
4 devices used for testing, which includes dynamically setting up dedicated bearers for video calls.

5 All the elements in the network like O-RAN system, 4G-5G core and the IMS Core need to have the ability to capture
6 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
7 traces/packet capture to calculate the ViLTE KPIs. Optionally, the network could have network taps deployed in
8 various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
9 different components need to have connectivity with each other – the end user device should be able to connect to the O-
10 RAN system (O-eNBs and gNBs), the O-RAN system needs to be connected to the 4G-5G core, which in turn should have
11 connectivity to the IMS Core.

12 6.7.1.3 Test Methodology/Procedure

13 Ensure the end user devices, O-RAN system, 4G-5G core and the IMS Core have all been configured as outlined in
14 Section 6.7.1.2. In this test scenario, one of the end user devices (UE-1) is going to be connected over the 4G O-eNB to
15 the 4G-5G core, while the other end user device (UE-2) is going to be connected over 5G gNB (O-RU, O-DU and O-
16 CU-CP/O-CU-UP) to the 4G-5G core. All traces and packet captures need to be enabled for the duration of the testing
17 to ensure all communication between network elements can be captured and validated.

18 1. Power on the two end user devices and ensure UE-1 is in LTE coverage while UE-2 is in 5G coverage.
19 Validate both end user devices register to the 4G-5G core.
20 2. Once the registration is complete, UE-1 and UE-2 have to establish a PDN connection and PDU session
21 respectively with the 4G-5G core. Once the PDN connection and PDU session have been setup, both the end user
22 devices have to register with the IMS core to support video calling services.
23 3. Use UE-1 as the MO end user device to video call UE-2. Validate the UE-2 performs EPS fallback to LTE to
24 receive the video call.
25 4. Answer the call on UE-2 and continue to have two-way audio/video communication for at least 5 minutes and
26 terminate the call. Measure the video KPIs included in Section 6.7.
27 5. At this point UE-2 should have completed the video call and moved back to connect to the 4G-5G Core over 5G
28 gNB (i.e. O-RU).
29 6. Use UE-2 as the MO end user device to video call UE-1. Validate that UE-2 performs EPS fallback to LTE before
30 making the video call.
31 7. Answer the call on UE-1 and continue to have two-way communication for at least 5 minutes and terminate the
32 call. Measure the video KPIs included in Section 6.7.
33 8. Repeat the test multiple times (> 10 times) and gather results; a sketch for validating the fallback from traces follows these steps.
34
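As a hedged illustration of steps 3 and 6, the following Python sketch classifies whether EPS fallback actually occurred, based on timestamped serving-RAT samples from the UE trace and the time of the SIP INVITE; the sample format is an assumption made for this example.

```python
# Illustrative only: decide from UE trace samples whether the device fell back
# from NR to LTE around call setup. rat_samples: list of (timestamp, rat)
# where rat is "NR" or "LTE"; invite_ts: time of the SIP INVITE;
# window_s: how long after the INVITE the fallback must complete.

def eps_fallback_occurred(rat_samples, invite_ts, window_s=10.0):
    samples = sorted(rat_samples)       # order by timestamp
    before = [rat for ts, rat in samples if ts <= invite_ts]
    after = [rat for ts, rat in samples
             if invite_ts < ts <= invite_ts + window_s]
    # Expected pattern: camped on NR before the INVITE, on LTE shortly after.
    return bool(before) and before[-1] == "NR" and "LTE" in after

samples = [(0.0, "NR"), (5.0, "NR"), (7.2, "LTE"), (9.0, "LTE")]
print(eps_fallback_occurred(samples, invite_ts=6.0))   # True
```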

35 6.7.1.4 Test Expectation (expected results)

36 As a pre-validation, use the traces to validate end user device UE-1 is able to perform a successful registration and PDN
37 connection setup while using the O-eNB in LTE coverage. Similarly, use the traces to validate end user device UE-2 is
38 able to perform a successful registration and PDU session setup while using the gNB in 5G coverage. Also validate both the
39 end user devices are able to register with the IMS core for video calling services. This is a prerequisite before these tests can be
40 validated.

41 Validate the end user devices are able to make a video call between each other by dynamically setting up a QCI-2
42 bearer to transfer voice and video packets over RTP. Validate the device which is in the 5G coverage falls back to 4G
43 before receiving or making a video call, in other words the EPS fallback procedure has been executed successfully. The
44 Call Setup Time will be higher than for Video over LTE due to the delay associated with executing the EPS fallback
45 procedure. Even though this test case does not directly impact the quality of the video call, for the sake of consistency,


1 ensure the voice and video quality from both parties are clear and audible without one-way or intermittent muting and
2 video freezing. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter which could
3 cause voice muting, video freezing and video lag issues. Use the packet captures to ensure there are no out-of-sequence
4 packets which could impact customer’s video calling experience.

5 The average range of values for the KPIs is included below for guidance. These values are applicable when the testing
6 is performed in a controlled environment in good radio conditions without the interference of external factors which
7 could impact the KPIs; for example, use of the internet to connect to remote network nodes could add latency, jitter and packet
8 loss issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in
9 this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside
10 the KPI range encountered during testing will have to be investigated to identify the root cause, as the issues may be
11 due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test
12 has to be repeated.

13 Table 6-25 Typical Video KPI values in controlled environments with good radio conditions

KPI With N26 Interface Without N26 Interface

CSSR – Call Setup Success Rate % >99% >99%

CST – Call Setup Time 3.5s 4s

MOS Score 3.5 3.5

Mute Rate % < 1% < 1%

One Way Call % < 1% < 1%

RTP Packet Loss % < 1% < 1%


14
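As an illustration of how the guidance values in Table 6-25 could be applied during result analysis, the following hedged Python sketch flags KPI results that warrant root-cause investigation. The thresholds are transcribed from the table above; treating the CST and MOS entries as upper/lower bounds is an interpretation made for this example, not a normative rule.

```python
# Illustrative only: compare measured KPIs against the Table 6-25 guidance.
# A flagged KPI means "investigate the root cause", not "failed test case".

GUIDANCE = {
    # kpi: (comparator, {"with_n26": bound, "without_n26": bound})
    "cssr_pct":     (">",  {"with_n26": 99.0, "without_n26": 99.0}),
    "cst_s":        ("<=", {"with_n26": 3.5,  "without_n26": 4.0}),
    "mos":          (">=", {"with_n26": 3.5,  "without_n26": 3.5}),
    "mute_pct":     ("<",  {"with_n26": 1.0,  "without_n26": 1.0}),
    "one_way_pct":  ("<",  {"with_n26": 1.0,  "without_n26": 1.0}),
    "rtp_loss_pct": ("<",  {"with_n26": 1.0,  "without_n26": 1.0}),
}

def kpis_to_investigate(measured, n26_available):
    mode = "with_n26" if n26_available else "without_n26"
    flagged = []
    for kpi, (op, bounds) in GUIDANCE.items():
        value, bound = measured[kpi], bounds[mode]
        ok = {"<": value < bound, "<=": value <= bound,
              ">": value > bound, ">=": value >= bound}[op]
        if not ok:
            flagged.append(kpi)
    return flagged

measured = {"cssr_pct": 99.5, "cst_s": 4.2, "mos": 3.7,
            "mute_pct": 0.4, "one_way_pct": 0.2, "rtp_loss_pct": 0.3}
print(kpis_to_investigate(measured, n26_available=True))  # ['cst_s']
```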

15 The end user video calling KPI values defined in Section 6.7 need to be included in the test report along with the
16 minimum configuration parameters listed in Section 3.3. The following information should also be included in the
17 test report for the testing performed to provide a comprehensive view of the test setup.

18 End user device side (real or emulated UE):

19 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

20 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

21 • Received downlink throughput (L1 and L3 PDCP layers) (average sample per second)

22 • Downlink transmission mode

23 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
24 (average sample per second)

25 The table below gives an example of the test report considering the mean and standard deviation of all test results that
26 have been captured.

27 Table 6-26 Example Test Report for Video Calling Service Testing – EPS Fallback

EPS Fallback MO EPS Fallback MT


Call Setup Success Rate
Call Setup time


MOS Score
Mute Rate
One Way Call
RTP Packet Loss
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
1 The end user ViLTE experience can also be impacted by some of the features available (see Section 6.7) on the O-eNB,
2 O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality which
3 could impact the end user’s video calling experience should be included in the test report to provide a comprehensive
4 view of the testing.

5 6.8 Video Service – Video over NR


6 Video calling service is slowly replacing voice as a basic calling service. The improvement in end user throughput on
7 LTE has facilitated the move from voice calls to video, and the high speeds on 5G are only going to accelerate this
8 migration process. This section of the document validates the user’s video calling experience in different conditions on
9 the 5G network. This section of the document only applies to the video calling service provided by the telecom service
10 provider, which uses the telecom operator’s IMS core to establish dedicated QoS flows to provide a superior video calling
11 experience. The KPIs which will be monitored to assess the video calling service are defined below; a sketch for
classifying muted and one-way calls follows the list.

12 • CSSR – Call Setup Success Rate % – Total number of successful calls divided by the total number of
13 calls made, as a percentage.
14 • CST – Call Setup Time – Time taken from the initial SIP INVITE to when the SIP 180 Ringing is received
15 in seconds.
16 • MOS Score – Mean Opinion Score for the video call. This needs to be measured on both ends of the video
17 call – mobile originated and mobile terminated.
18 • Mute Rate % – This is the percentage of video calls in which the audio was muted or the video froze in both
19 directions (calls with RTP loss of > 3-4 s in both directions are considered muted calls). This needs to be measured on
20 both ends of the video call – mobile originated and mobile terminated – and counted only once per call.
21 • One Way Calls % – This is the percentage of video calls in which the audio was muted or the video was not
22 transmitted in one direction only (video calls with RTP loss of > 3-4 s in one direction only are considered one-way
23 calls). This needs to be monitored on both ends of the video call – mobile originated and mobile terminated –
24 and counted only once per call.
25 • RTP Packet Loss % – Number of RTP packets which were dropped/lost in the uplink/downlink direction as a
26 percentage of total packets. This needs to be measured on both ends of the video call – mobile originated
27 and mobile terminated.
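As a hedged illustration of the Mute Rate and One Way Call definitions above, the following Python sketch classifies a call from the longest RTP loss gap observed in each media direction. The 3 s default is taken from the "> 3-4 s" guidance in the definitions; the input format is an assumption made for this example.

```python
# Illustrative only: classify a call as muted / one-way / normal from the
# RTP packet arrival timestamps captured in each media direction.

def longest_gap(timestamps):
    """Longest interval (s) with no RTP packets, given arrival timestamps."""
    ts = sorted(timestamps)
    return max((b - a for a, b in zip(ts, ts[1:])), default=0.0)

def classify_call(uplink_ts, downlink_ts, gap_threshold_s=3.0):
    ul_gap = longest_gap(uplink_ts)
    dl_gap = longest_gap(downlink_ts)
    if ul_gap > gap_threshold_s and dl_gap > gap_threshold_s:
        return "muted"        # loss in both directions -> counts in Mute Rate %
    if ul_gap > gap_threshold_s or dl_gap > gap_threshold_s:
        return "one-way"      # loss in one direction only -> One Way Call %
    return "normal"

# Example: downlink stalls for ~5 s while uplink keeps flowing.
ul = [i * 0.02 for i in range(500)]            # packet every 20 ms for 10 s
dl = [t for t in ul if not 2.0 < t < 7.0]      # ~5 s hole in the downlink
print(classify_call(ul, dl))                    # one-way
```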


2 End user video calling service experience can also be impacted by some of the features available on the O-RU, O-DU
3 and O-CU-CP/O-CU-UP. Some of these features have been included below. As a part of the video calling services
4 testing, details of these features need to be included in the test report to get a comprehensive view of the setup used for
5 testing. If additional features or functionalities that impact the end user video calling experience have been enabled during
6 this testing, those need to be included in the test report as well.

7 • Basic Voice over NR


8 • RLC in Unacknowledged Mode
9 • Robust Header Compression
10 • DRX
11 • Dynamic GBR Admission Control
12 • TTI Bundling
13 • Automated Neighbor Relations (ANR)
14 • Connected mode mobility (Handover)
15 • Idle mode reselection
16 • Intra-frequency Cell Reselection/Handover
17 • Inter-frequency Redirection/Cell Reselection/Handover
18 • NR Coverage-Triggered NR Session Continuity
19

20 Please see the summary table of test cases and applicable technologies below.

21 Table 6-27: Video over NR Test Case summary

Test ID    Test case                                              Functional group    Applicable technology
                                                                                      LTE     NSA     SA
6.8.1      Video over NR Test – stationary                        Service             N/A     N/A     Y
6.8.2      Video – Intra-Distributed Unit (DU) handover           Service             N/A     N/A     Y
6.8.3      Video – Intra-Central Unit (CU) Inter-Distributed      Service             N/A     N/A     Y
           Unit (DU) handover
6.8.4      Video – Inter-Central Unit (CU) handover               Service             N/A     N/A     Y
22
23

24 6.8.1 Video over NR – Stationary Testing

25 This test scenario validates the user’s video calling experience when the end user device (UE) makes a video call on the
26 5G network.

27 6.8.1.1 Test Description

28 This section of the document tests video over NR user experience on an O-RAN system. Video over NR is similar to
29 ViLTE where it uses IP packets to send and receive audio/video packets, with the IP packets being transferred over NR


1 or 5G radio. Just like ViLTE, IMS is used to set up the control and data planes for the video communication. As video
2 calling service is latency sensitive, the 5G core interacts with the IMS core to set up different QoS flows for video traffic
3 – 5QI-5 for the control plane (SIP signalling) and 5QI-2 for the data plane (conversational video). This test case is
4 applicable when the UE is connected over a 5G SA network.
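For reference, a minimal sketch of the two QoS flow profiles involved is given below. The characteristics are transcribed from the standardized 5QI table in 3GPP TS 23.501 (Table 5.7.4-1); the dictionary layout and the helper function are purely illustrative.

```python
# Illustrative only: standardized 5QI characteristics relevant to this test,
# transcribed from 3GPP TS 23.501 Table 5.7.4-1.
QOS_FLOWS = {
    5: {  # IMS signalling (SIP) - control plane of the video call
        "resource_type": "Non-GBR",
        "priority_level": 10,
        "packet_delay_budget_ms": 100,
        "packet_error_rate": 1e-6,
    },
    2: {  # Conversational video (live streaming) - media plane
        "resource_type": "GBR",
        "priority_level": 40,
        "packet_delay_budget_ms": 150,
        "packet_error_rate": 1e-3,
    },
}

def media_flow_ok(observed_delay_ms, observed_loss_rate, five_qi=2):
    """Compare observed media-plane behaviour against the 5QI targets."""
    profile = QOS_FLOWS[five_qi]
    return (observed_delay_ms <= profile["packet_delay_budget_ms"]
            and observed_loss_rate <= profile["packet_error_rate"])

print(media_flow_ok(observed_delay_ms=80, observed_loss_rate=5e-4))  # True
```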

5 6.8.1.2 Test Setup

6 The SUT in this test case would be a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) which is used to test the video
7 calling service. A 5G SA Core would be required to support basic functionality to authenticate and register the end user
8 device to establish a PDU session. An IMS core will be required to register the end user device to support video calling
9 services on a 5G network. The 5G and IMS cores could be completely emulated, partially emulated or real (non-
10 emulated) cores. At least two end user devices (UEs), real or emulated, are needed, and both
11 have to support video calling service using Video over NR. The end user devices will serve as Mobile Originated (MO)
12 and Mobile Terminated (MT) end user devices forming the two ends of the video call. Going forward in this section,
13 these end user devices will be referred to as MO end user device and MT end user device to represent their role in the
14 video call. The test setup should include tools which have the ability to collect traces on the elements and/or packet
15 captures of communication between the elements. This could be a built-in capability of the emulated/non-emulated
16 network elements or an external tool. Optionally, if some of the network elements are located remotely either in a cloud
17 or on the internet, the additional latency should be calculated and accounted for.

18 The O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. The SUT will also
19 need to be setup to run this testing in different radio conditions as outlined in Section 3.6. The end user device must be
20 configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 5G
21 core. The end user devices also need to be provisioned with the user credentials to register and setup PDU session with
22 the 5G core and register with the IMS core to perform video call using Video over NR. The 5G core network and IMS
23 core must be configured to support video calling service on the end user devices used for testing, which includes the
24 ability to dynamically set up QoS Flows for video calls. The locations where the radio conditions are excellent, good,
25 fair and poor need to be identified within the serving cell.

26 All the elements in the network like O-RAN system, 5G core and the IMS Core need to have the ability to capture
27 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
28 traces/packet capture to calculate the Video over NR KPIs. Optionally, the network could have network taps deployed
29 in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
30 different components need to have connectivity with each other – the end user device should be able to connect to the O-
31 RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP), the O-RAN system needs to be connected to the 5G core, which in
32 turn should have connectivity to the IMS Core.

33 6.8.1.3 Test Methodology/Procedure

34 Ensure the end user devices, O-RAN system, 5G core and the IMS core have all been configured as outlined in Section
35 6.8.1.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use the
36 same O-RAN system, 5G and IMS core to perform the end-to-end video call. All traces and packet captures need to be
37 enabled for the duration of the testing to ensure all communication between network elements can be captured and
38 validated.

39 1. Power on the two end user devices in excellent radio conditions and ensure both devices register with the 5G
40 core for video calling services over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP).
41 2. Once the registration is complete, the MO and MT end user devices have to establish a PDU session with the 5G
42 core. Once the PDU session has been setup, both the end user devices have to register with the IMS core to support
43 video calling services.
44 3. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to
45 receive and answer the call.


1 4. Continue to have two-way audio/video communication on the video call for at least 5 minutes before terminating it.
2 5. Repeat the test multiple times (> 10 times) and gather results.
3 6. Repeat the above steps 1 through 5 for the good, fair and poor radio conditions.
4

5 6.8.1.4 Test Expectation (expected results)

6 As a pre-validation, use the traces to validate a successful registration and PDU session setup by the end user devices
7 without any errors over O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate both the end user devices are able to
8 register with the IMS core for video calling services. This is a prerequisite before these tests can be validated.

9 Validate the end user devices are able to make a video call between each other by dynamically setting up a 5QI-2 QoS flow
10 to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and the voice and video quality
11 from both parties is clear and audible without one-way or intermittent muting and video freezing. Use the packet
12 captures to validate there are no RTP packet drops or high RTP packet jitter which could cause voice muting, video
13 freezing or video lag issues. Use the packet captures to ensure there are no out-of-sequence packets which could impact
14 the customer’s video calling experience.
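The "high RTP packet jitter" check above can be quantified with the interarrival jitter estimator from RFC 3550 (section 6.4.1). A minimal Python sketch follows; the input format is an assumption, and the RTP timestamps must be interpreted in the clock units of the negotiated codec.

```python
# Illustrative only: RFC 3550 interarrival jitter estimator.
# packets: list of (arrival_time_s, rtp_timestamp) per received RTP packet;
# clock_rate: RTP clock rate of the codec (e.g. 90000 Hz for video).

def rtp_interarrival_jitter(packets, clock_rate=90000):
    jitter = 0.0
    for (r_prev, s_prev), (r_cur, s_cur) in zip(packets, packets[1:]):
        # D = difference of relative transit times, in RTP clock units.
        d = (r_cur - r_prev) * clock_rate - (s_cur - s_prev)
        # RFC 3550, section 6.4.1: J += (|D| - J) / 16
        jitter += (abs(d) - jitter) / 16.0
    return jitter / clock_rate * 1000.0          # jitter in milliseconds

pkts = [(0.000, 0), (0.021, 1800), (0.039, 3600), (0.062, 5400)]
print(round(rtp_interarrival_jitter(pkts), 3))   # small jitter, in ms
```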

15 The average range of values for the KPIs is included below for guidance. These values are applicable when the testing
16 is performed in a controlled environment in good radio conditions without the interference of external factors which
17 could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and
18 reliability issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the
19 testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results
20 outside the KPI range encountered during testing will have to be investigated to identify the root cause, as the issues
21 may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and
22 the test has to be repeated.

23 • CSSR – Call Setup Success Rate % – > 99%
24 • CST – Call Setup Time – < 2.5 s
25 • MOS Score – > 3.5
26 • Mute Rate % – < 1%
27 • One Way Call % – < 1%
28 • RTP Packet Loss % – < 1%
29

30 As a part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test
31 report. The following information should also be included in the test report to provide a comprehensive view of the test
32 setup.

33 UE side (real or emulated UE):

34 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

35 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

36 • Downlink transmission mode

37 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
38 (average sample per second)

39 Table 6-28 Example Test Report for Video over NR – Stationary Testing

Excellent (cell centre)    Good                   Fair                   Poor (cell edge)
Video over NR MO/MT        Video over NR MO/MT    Video over NR MO/MT    Video over NR MO/MT

Call Setup Success Rate


Call Setup Time
MOS Score
Mute Rate %
One Way Call %
RTP Packet Loss %
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
1 The end user video calling service experience can also be impacted by some of the features available (see Section 6.8)
2 on the O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality
3 which could impact the end user’s video calling service experience should be included in the test report to provide a
4 comprehensive view of the testing.

5 6.8.2 Video over NR – Intra-Distributed Unit (DU) handover

6 This test scenario validates the user’s video calling experience when the end user device (UE) is on a video call over
7 NR and performs a handover between O-RUs which connect to the same O-DU (Intra-O-DU handover).

8 6.8.2.1 Test Description

9 The 5G O-RAN system has multiple sub-components like the O-RU, O-DU and the O-CU-CP/O-CU-UP. This setup
10 leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user’s
11 video calling experience during the handover between two O-RUs which are connected to the same O-DU (and O-CU-
12 CP/O-CU-UP), hence an Intra-O-DU handover. This handover will be agnostic to the 5G core as the handover occurs
13 on the O-RAN system. This test assesses the impact on the end user’s voice service in this handover scenario by
14 monitoring the KPIs included in Section 6.8.

15 6.8.2.2 Test Setup

16 The SUT in this test case would be a pair of O-RUs which connect to the same O-DU and O-CU-CP/O-CU-UP. This O-
17 RAN setup is used to test the video calling service during a handover (refer Section 4.4). A 5G SA Core would be
18 required to support basic functionality to authenticate and register the end user device to establish a PDU session. An
19 IMS core will be required to register the end user device to support video calling services on a 5G network. The 5G and
20 IMS cores could be completely emulated, partially emulated or real (non-emulated) cores. At least two end


1 user devices (UEs), real or emulated, are needed, and both have to support video calling service using 5G
2 – Video over NR. The end user devices will serve as Mobile Originated (MO) and Mobile Terminated (MT) end user
3 devices forming the two ends of the video call. Going forward in this section, these end user devices will be referred to
4 as MO end user device and MT end user device to represent their role in the video call. The test setup should include
5 tools which have the ability to collect traces on the elements and/or packet captures of communication between the
6 elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool.
7 Optionally, if some of the network elements are located remotely either in a cloud or on the internet, the additional
8 latency should be calculated and accounted for.

9 The pair of O-RUs (O-RU1 and O-RU2) need to be connected to the O-DU and O-CU-CP/O-CU-UP and all the
10 components need to have the right configuration and software load. The end user devices must be configured with the
11 right user credentials to be able to register and authenticate with the O-RAN system and the 5G core. The end user
12 devices also need to be provisioned with the user credentials to register and setup PDU session with the 5G core and
13 register with the IMS core to perform video calls using Video over NR. The 5G core network and IMS core must be
14 configured to support video calling service on the end user devices used for testing, which includes the ability to dynamically
15 set up QoS Flows for video calls.

16 All the elements in the network like O-RAN system, 5G core and the IMS Core need to have the ability to capture
17 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
18 traces/packet capture to calculate the Video over NR KPIs. Optionally, the network could have network taps deployed
19 in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
20 different components need to have connectivity with each other – the end user device should be able to connect to the O-
21 RAN system (O-RUs connected to the O-DU and O-CU-CP/O-CU-UP), the O-RAN system needs to be connected to the 5G
22 core, which in turn should have connectivity to the IMS Core.

23 6.8.2.3 Test Methodology/Procedure

24 Ensure the end user devices, O-RAN system, 5G core and the IMS core have all been configured as outlined in Section
25 6.8.2.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use the
26 same O-RAN system, 5G and IMS core to perform the end-to-end video call. All traces and packet captures need to be
27 enabled for the duration of the testing to ensure all communication between network elements can be captured and
28 validated.

29 1. Power on the two end user devices and ensure both devices register with the 5G core for video calling
30 services over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP). Ensure both the MO &
31 MT end user devices are in the coverage area of the same O-RU – O-RU1.
32 2. Once the registration is complete, the MO and MT end user devices have to establish a PDU session with the 5G
33 core. Once the PDU session has been set up, both the end user devices have to register with the IMS core to support
34 video calling services.
35 3. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to
36 receive and answer the call.
37 4. Once the call has been setup, move the MO end user device from the coverage area of O-RU1 to coverage area of
38 O-RU2, thus causing a handover from O-RU1 to O-RU2.
39 5. Continue the two-way video and voice communication between MO and MT end user devices until the handover
40 procedure is complete before terminating the video call.
41 6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results; a sketch for checking media continuity across the handover follows these steps.
42
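As a hedged illustration of validating step 5, the following Python sketch checks that the media stream survived the handover by measuring the longest RTP interruption around the handover window taken from the O-DU/O-CU traces. The 200 ms acceptance value is an illustrative assumption, not a normative limit.

```python
# Illustrative only: verify media continuity across the Intra-O-DU handover.
# rtp_arrivals: RTP packet arrival timestamps (s) at the receiving UE;
# ho_start/ho_end: handover window taken from the RAN traces.

def handover_media_gap(rtp_arrivals, ho_start, ho_end, margin_s=1.0):
    """Longest RTP silence (s) overlapping the handover window +/- margin."""
    lo, hi = ho_start - margin_s, ho_end + margin_s
    window = sorted(t for t in rtp_arrivals if lo <= t <= hi)
    if len(window) < 2:
        return hi - lo                  # no media at all around the handover
    return max(b - a for a, b in zip(window, window[1:]))

arrivals = [i * 0.02 for i in range(1500) if not 10.00 < i * 0.02 < 10.12]
gap = handover_media_gap(arrivals, ho_start=10.0, ho_end=10.1)
print(f"worst gap during handover: {gap * 1000:.0f} ms")   # ~140 ms
print("media continuity OK" if gap < 0.2 else "investigate interruption")
```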


1 6.8.2.4 Test Expectation (expected results)

2 As a pre-validation, use the traces to validate a successful registration and PDU session setup by the end user devices
3 without any errors over O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate both the end user devices are able to
4 register with the IMS core for video calling services. This is a prerequisite before these tests can be validated.

5 Validate the end user devices are able to make a video call between each other by dynamically setting up a 5QI-2 QoS flow
6 to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and the voice and video
7 quality from both parties is clear and audible without one-way or intermittent muting and video freezing, through the
8 duration of the call, especially during the handover process. Use the packet captures to validate there are no RTP packet
9 drops or high RTP packet jitter which could cause voice muting, video freezing or video lag issues. Use the packet
10 captures to ensure there are no out-of-sequence packets which could impact the customer’s video calling experience.

11 The average range of values for the KPIs is included below for guidance. These values are applicable when the testing
12 is performed in a controlled environment in good radio conditions without the interference of external factors which
13 could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and
14 reliability issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the
15 testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results
16 outside the KPI range encountered during testing will have to be investigated to identify the root cause, as the issues
17 may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and
18 the test has to be repeated.

19 • CSSR – Call Setup Success Rate % – > 99%
20 • CST – Call Setup Time – < 2.5 s
21 • MOS Score – > 3.5
22 • Mute Rate % – < 1%
23 • One Way Call % – < 1%
24 • RTP Packet Loss % – < 1%
25

26 As a part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test
27 report. The following information should also be included in the test report to provide a comprehensive view of the test
28 setup.

29 UE side (real or emulated UE):

30 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

31 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

32 • Downlink transmission mode

33 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
34 (average sample per second)

35 Table 6-29 Example Test Report for Video over NR – Intra-O-DU Handover

Video over NR MO Video over NR MT

Call Setup Success Rate


Call Setup Time
MOS Score
Mute Rate %


One Way Call %


RTP Packet Loss %
L1 DL Spectral efficiency [bps/Hz]
UE RSRP [dBm]
UE PDSCH SINR [dB]
MIMO rank
PDSCH MCS
DL RB number
UE CQI
UE RSRQ
UE PMI
UE RSSI
UE Buffer status
UE Packet delay
PDSCH BLER [%]
1 The end user video calling service experience can also be impacted by some of the features available (see Section 6.8)
2 on the O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality
3 which could impact the end user’s video calling service experience should be included in the test report to provide a
4 comprehensive view of the testing.

5 6.8.3 Video over NR – Intra-Central Unit (CU) Inter-Distributed Unit (DU)


6 handover

7 This test scenario validates the user’s video calling experience when the end user device (UE) is on a video call and
8 performs a handover between O-RUs which connect to different O-DUs which in turn are connected to the same O-CU-
9 CP/O-CU-UP (Intra-O-CU Inter-O-DU handover).

10 6.8.3.1 Test Description

11 The 5G O-RAN system has multiple sub-components like the O-RU, O-DU and the O-CU-CP/O-CU-UP. This setup
12 leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user’s
13 video calling experience during the handover between two O-RUs, where the O-RUs are connected to different O-DUs
14 which in turn are connected to the same O-CU-CP/O-CU-UP, hence an Intra-O-CU Inter-O-DU handover. This
15 handover will be transparent to the 5G core as the handover occurs on the O-RAN system. This test assesses the impact on
16 the end user’s video calling service in this handover scenario by monitoring the KPIs included in Section 6.8.

17 6.8.3.2 Test Setup

18 The SUT in this test case would be a pair of O-RUs, each connecting to a different O-DU, which in turn connect to
19 the same O-CU-CP/O-CU-UP (refer to Section 4.5). This O-RAN setup is used to test the video calling service during a handover.
20 A 5G SA Core would be required to support basic functionality to authenticate and register the end user device to
21 establish a PDU session. An IMS core will be required to register the end user device to support video calling services on a 5G
22 network. The 5G and IMS cores could be completely emulated, partially emulated or real (non-emulated) cores. At
23 least two end user devices (UEs), real or emulated, are needed, and both have to support
24 video calling service using Video over NR. The end user devices will serve as Mobile Originated (MO) and Mobile


1 Terminated (MT) end user devices forming the two ends of the video call. Going forward in this section, these end user
2 devices will be referred to as the MO end user device and the MT end user device to represent their roles in the video call. The
3 test setup should include tools which have the ability to collect traces on the elements and/or packet captures of
4 communication between the elements. This could be a built-in capability of the emulated/non-emulated network
5 elements or an external tool. Optionally, if some of the network elements are located remotely either in a cloud or on the
6 internet, the additional latency should be calculated and accounted for.

7 The pair of O-RUs (O-RU1 and O-RU2) need to be connected to a pair of O-DUs (O-DU1 and O-DU2) and O-CU-
8 CP/O-CU-UP. As for the O-RU to O-DU connection, ensure O-RU1 connects to O-DU1, and O-RU2 connects to O-
9 DU2, and both the O-DUs connect to the same O-CU-CP/O-CU-UP. All the O-RAN components need to have the right
10 configuration and software load. The end user devices also need to be provisioned with the user credentials to register
11 and setup PDU session with the 5G core and register with the IMS core to perform video call using Video over NR. The
12 5G core network and IMS core must be configured to support video calling service on the end user devices used for
13 testing, which includes the ability to dynamically set up QoS Flows for video calls.

14 All the elements in the network like O-RAN system, 5G core and the IMS Core need to have the ability to capture
15 traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture
16 traces/packet capture to calculate the Video over NR KPIs. Optionally, the network could have network taps deployed
17 in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
18 different components need to have connectivity with each other – the end user device should be able to connect to the O-
19 RAN system (O-RUs connected to the O-DUs and O-CU-CP/O-CU-UP), the O-RAN system needs to be connected to the 5G
20 core, which in turn should have connectivity to the IMS Core.

21 6.8.3.3 Test Methodology/Procedure

22 Ensure the end user devices, O-RAN system, 5G core and the IMS Core have all been configured as outlined in Section
23 6.8.3.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use the
24 same O-RAN system, 5G and IMS core to perform the end-to-end video call. All traces and packet captures need to be
25 enabled for the duration of the testing to ensure all communication between network elements can be captured and
26 validated.

27 1. Power on the two end user devices and ensure both devices register with the 5G core for video calling services
28 over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP). Ensure both the MO & MT end
29 user devices are in the coverage area of the same O-RU – O-RU1.
30 2. Once the registration is complete, the MO and MT end user devices have to establish a PDU session with the 5G
31 core. Once the PDU session has been setup, both the end user devices have to register with the IMS core to support
32 video calling services.
33 3. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to
34 receive and answer the call.
35 4. Once the video call has been setup, move the MO end user device from the coverage area of O-RU1 to coverage
36 area of O-RU2, thus causing a handover from O-RU1 to O-RU2, and O-DU1 to O-DU2.
37 5. Continue the two-way video and voice communication between the end user devices until the handover procedure
38 is complete before terminating the video call.
39 6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results.
40

41 6.8.3.4 Test Expectation (expected results)

42 As a pre-validation, use the traces to validate a successful registration and PDU session setup by the end user devices
43 without any errors over O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate both the end user devices are able to
44 register with the IMS core for video calling services. This is a prerequisite before these tests can be validated.


1 Validate the end user devices are able to make a video call between each other by dynamically setting up a 5QI-2 QoS flow
2 to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and the voice and video quality
3 from both parties is clear and audible without one-way or intermittent muting and video freezing, through the duration
4 of the call, especially during the handover process. Use the packet captures to validate there are no RTP packet drops or
5 high RTP packet jitter which could cause voice muting, video freezing or video lag issues. Use the packet captures to
6 ensure there are no out-of-sequence packets which could impact the customer’s video calling experience.

7 The average range of values for the KPIs is included below for guidance. These values are applicable when the testing
8 is performed in a controlled environment in good radio conditions without the interference of external factors which
9 could impact the KPIs; for example, use of the internet to connect to remote servers/hosts could add latency, jitter and
10 reliability issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the
11 testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results
12 outside the KPI range encountered during testing will have to be investigated to identify the root cause, as the issues
13 may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and
14 the test has to be repeated.

15 • CSSR – Call Setup Success Rate % – > 99%
16 • CST – Call Setup Time – < 2.5 s
17 • MOS Score – > 3.5
18 • Mute Rate % – < 1%
19 • One Way Call % – < 1%
20 • RTP Packet Loss % – < 1%
21

22 As a part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test
23 report. The following information should also be included in the test report to provide a comprehensive view of the test
24 setup.

25 UE side (real or emulated UE):

26 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

27 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

28 • Downlink transmission mode

29 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
30 (average sample per second)

Table 6-30 Example Test Report for Video over NR – Intra-O-CU Inter-O-DU Handover

KPI | Video over NR MO | Video over NR MT
Call Setup Success Rate | |
Call Setup Time | |
MOS Score | |
Mute Rate % | |
One Way Call % | |
RTP Packet Loss % | |
L1 DL Spectral efficiency [bps/Hz] | |
UE RSRP [dBm] | |
UE PDSCH SINR [dB] | |
MIMO rank | |
PDSCH MCS | |
DL RB number | |
UE CQI | |
UE RSRQ | |
UE PMI | |
UE RSSI | |
UE Buffer status | |
UE Packet delay | |
PDSCH BLER [%] | |
1 The end user video calling service experience can also be impacted by some of the features available (see Section 6.8)
2 on the O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality
3 which could impact the end user’s video calling service experience should be included in the test report to provide a
4 comprehensive view of the testing.

6.8.4 Video over NR – Inter-Central Unit (CU) handover

This test scenario validates the user's video calling experience when the end user device (UE) is on a video call and performs a handover between O-RUs which are connected to different O-DUs and O-CU-CP/O-CU-UPs (Inter-O-CU Inter-O-DU handover).

9 6.8.4.1 Test Description

The 5G O-RAN system has multiple sub-components like the O-RU, O-DU and the O-CU-CP/O-CU-UP. This setup leads to multiple handover scenarios between the O-RAN sub-components. This particular scenario tests the end user's video calling experience during the handover between two O-RUs, where the O-RUs are connected to different O-DUs which in turn are connected to different O-CU-CP/O-CU-UPs, hence an Inter-O-CU Inter-O-DU handover. This handover occurs on the O-RAN system, and the 5G core is aware of the handover as it needs to send data to a new O-CU-CP/O-CU-UP as a part of the handover process. This test assesses the impact on the end user's video calling service in this handover scenario by monitoring the KPIs included in Section 6.8.

17 6.8.4.2 Test Setup

The SUT in this test case would be a pair of O-RAN subsystems: a set of O-RU, O-DU and O-CU-CP/O-CU-UP which interconnects with another set of O-RU, O-DU and O-CU-CP/O-CU-UP (refer to Section 4.6). This O-RAN setup is used to test the video calling service during a handover. A 5G SA core would be required to support basic functionality to authenticate and register the end user devices and establish PDU sessions. An IMS core will be required to register the end user devices to support voice and video services on a 5G network. The 5G and IMS cores could be completely emulated, partially emulated or real non-emulated cores. At least two end user devices (UEs) are needed, real or emulated, and both have to support video calling using Video over NR. The end user devices will serve as the Mobile Originated (MO) and Mobile Terminated (MT) ends of the video call. Going forward in this section, these end user devices will be referred to as the MO end user device and the MT end user device to represent their role in the video call. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of the communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.


The pair of O-RAN subsystems (O-RAN1 and O-RAN2) will be connected as follows: O-RU1 will be connected to O-DU1, which in turn will be connected to O-CU-CP/O-CU-UP1; similarly, O-RU2 will be connected to O-DU2, which in turn will be connected to O-CU-CP/O-CU-UP2. The O-CU-CP/O-CU-UP1 and O-CU-CP/O-CU-UP2 nodes will be connected to each other and to the 5G core. All the O-RAN components need to have the right configuration and software load. The end user devices must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 5G core. The end user devices also need to be provisioned with the user credentials to register and set up a PDU session with the 5G core and to register with the IMS core to perform a video call using Video over NR. The 5G core network and IMS core must be configured to support video calling service on the end user devices used for testing, which includes the ability to dynamically set up QoS Flows for the video calls.

All the elements in the network, like the O-RAN system, 5G core and IMS core, need to have the ability to capture traces to validate the successful execution of the test cases. The end user devices need to have the capability to capture traces/packet captures to calculate the Video over NR KPIs. Optionally, the network could have network taps deployed in various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these different components need to have connectivity with each other: the end user device should be able to connect to the O-RAN system (O-RUs connected to O-DUs, which are connected to the O-CU-CP/O-CU-UPs), and the O-RAN system needs to be connected to the 5G core, which in turn should have connectivity to the IMS core.

17 6.8.4.3 Test Methodology/Procedure

Ensure the end user devices, O-RAN system, 5G core and the IMS core have all been configured as outlined in Section 6.8.4.2. In this test scenario, both the mobile originated and mobile terminated end user devices are going to use the same O-RAN system, 5G core and IMS core to perform the end-to-end video call. All traces and packet captures need to be enabled for the duration of the testing to ensure all communication between network elements can be captured and validated.

1. Power on the two end user devices and ensure both devices register with the 5G core for video calling services over SA by connecting over the O-RAN1 system (O-RU1, O-DU1 and O-CU-CP/O-CU-UP1). Ensure both the MO and MT end user devices are in the coverage area of the same O-RU, O-RU1.
2. Once the registration is complete, the MO and MT end user devices have to establish a PDU session with the 5G core. Once the PDU session has been set up, both end user devices have to register with the IMS core to support video calling services.
3. Use the MO end user device to video call the MT end user device. Validate the MT end user device is able to receive and answer the video call.
4. Once the call has been set up, move the MO end user device from the coverage area of O-RU1 to the coverage area of O-RU2, thus causing a handover from the O-RAN1 to the O-RAN2 subsystem: O-RU1 to O-RU2, O-DU1 to O-DU2 and O-CU-CP/O-CU-UP1 to O-CU-CP/O-CU-UP2.
5. Continue the two-way video and voice communication between the MO and MT end user devices until the handover procedure is complete before terminating the video call.
6. Repeat the test (steps 1 through 5) multiple times (> 10 times) and gather results.
37

38 6.8.4.4 Test Expectation (expected results)

As a pre-validation, use the traces to validate a successful registration and PDU session setup by the end user devices without any errors over the O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate that both end user devices are able to register with the IMS core for video calling services. This is a prerequisite before these tests can be validated.

Validate the end user devices are able to make a video call to each other by dynamically setting up a 5QI-2 bearer to transfer voice and video packets over RTP. Ensure the Call Setup Time is reasonable and that the voice and video quality from both parties is clear and audible, without one-way audio, intermittent muting or video freezing, through the duration of the call, especially during the handover process. Use the packet captures to validate there are no RTP packet drops or high RTP packet jitter, which could cause voice muting, video freezing or video lag issues. Use the packet captures to ensure there are no out-of-sequence packets which could impact the customer's voice experience.

The average range of values for the KPIs is included below for guidance. These values are applicable when the testing is performed in a controlled environment in good radio conditions, without the interference of external factors which could impact the KPIs; for example, using the internet to connect to remote servers/hosts could add latency, jitter and reliability issues to the connection, thus impacting the KPIs. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI range encountered during testing have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

• CSSR – Call Setup Success Rate – > 99%
• CST – Call Setup Time – < 2.5 s
• MOS Score – > 3.5
• Mute Rate % – < 1%
• One Way Call % – < 1%
• RTP Packet Loss % – < 1%
17

18 As a part of gathering data, ensure the minimum configuration parameters (see Section 3.3) are included in the test
19 report. The following information should also be included in the test report to provide a comprehensive view of the test
20 setup.

21 UE side (real or emulated UE):

22 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

23 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

24 • Downlink transmission mode

25 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
26 (average sample per second)

Table 6-31 Example Test Report for Video over NR – Inter-O-CU Handover

KPI | Video over NR MO | Video over NR MT
Call Setup Success Rate | |
Call Setup Time | |
MOS Score | |
Mute Rate % | |
One Way Call % | |
RTP Packet Loss % | |
L1 DL Spectral efficiency [bps/Hz] | |
UE RSRP [dBm] | |
UE PDSCH SINR [dB] | |
MIMO rank | |
PDSCH MCS | |
DL RB number | |
UE CQI | |
UE RSRQ | |
UE PMI | |
UE RSSI | |
UE Buffer status | |
UE Packet delay | |
PDSCH BLER [%] | |
1 The end user video calling service experience can also be impacted by some of the features available (see Section 6.8)
2 on the O-RU, O-DU and O-CU-CP/O-CU-UP. The details of these features, along with any other features/functionality
3 which could impact the end user’s video calling service experience should be included in the test report to provide a
4 comprehensive view of the testing.

6.9 URLLC
The 5G network has been built to support not just the data and voice use cases of the past, but a wide variety of use cases across a diverse set of verticals like industrial automation, health care, gaming, etc. The 5G network was built for flexibility and efficiency, with additional enhancements being added in every release to enable ultra-low latency and/or ultra-reliability for use cases such as automotive applications, augmented and virtual reality, etc. Network Slicing in 5G allows the separation of the network, where services with different characteristics and requirements can each run on their own slice of the network. These enhancements in 5G, paired with the network slicing capability, allow a telecommunication service provider to support this diverse set of use cases on different network slices of the same 5G network.

Ultra-Reliable Low Latency Communication (URLLC) is one such slice, which allows services that require low E2E latency to be deployed. As this is a new concept, new services and use cases are being created to use this slice of the network. The generic KPIs which will be monitored to assess the URLLC use cases are included below. As the URLLC use cases vary, the KPIs used and the values for these KPIs also vary by use case; not all KPIs are used for all use cases, and some KPIs are used only in specific use cases.

19 • Reliability – This is the percentage of packets or messages sent which were successfully delivered to the
20 end node.
21 • End to End Latency – Time taken to transfer a packet/message from the end user device to the application
22 or vice versa.
23 • Jitter – This is the variation of the end-to-end latency values seen in the network setup.
24 • Position Service Accuracy – This is the distance between the location provided by the location service and
25 the real location of the target object.
26 • Position Service Latency – Time elapsed between the request to get location information to when the
27 location information is available.
28 • User experienced throughput – The throughput as measured on the application layer.
29 • Mean time between failures – This is the duration of time the service was available before a failure
30 condition which causes the service to be unavailable.
31
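As an illustration of how the first three KPIs above can be derived in practice, the following Python sketch computes reliability, end-to-end latency and jitter from matched send/receive timestamps; the input format, a list of (sequence, send time, receive time or None) tuples in seconds, is an assumption, since real tooling would extract these values from traces or packet captures.

    from statistics import mean, pstdev

    def urllc_kpis(samples):
        delivered = [(s, tx, rx) for s, tx, rx in samples if rx is not None]
        reliability = 100.0 * len(delivered) / len(samples)
        latencies_ms = [(rx - tx) * 1000.0 for _, tx, rx in delivered]
        return {
            "Reliability [%]": reliability,
            "E2E latency [ms]": mean(latencies_ms),
            "Jitter [ms]": pstdev(latencies_ms),  # variation of the latency values
        }

    # one lost packet (sequence 3) out of four
    print(urllc_kpis([(1, 0.000, 0.004), (2, 0.010, 0.013),
                      (3, 0.020, None), (4, 0.030, 0.034)]))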

This section of the document gives the procedure to test URLLC applications in a telecommunication network. URLLC refers to a diverse set of applications where the end user device communicates with the application server over the 5G network, and the use case itself has stringent requirements for the end-to-end latency and/or reliability between the end user device and application server. 5G has enhancements to its RAN and core to support these requirements. The 5G RAN allows for a shorter turnaround between downlink data and the uplink acknowledgement, thus reducing the latency on the air interface. Similarly, the 5G core allows the data plane anchor (UPF) to be deployed close to the end user device at the edge location, along with the application server, to eliminate part of the backhaul latency.


1 Please see below summary table of test cases and applicable technology.

Table 6-32: Augmented Reality Test Case summary

Test ID | Test case | Functional group | LTE | NSA | SA
6.9.1 | Augmented Reality Service | URLLC-Augmented Reality | N/A | N/A | Y

5 6.9.1 Augmented Reality


6 This test scenario validates the end user’s augmented reality experience when using an AR device over the 5G network.

7 6.9.1.1 Test Description

This section of the document uses Augmented Reality (AR) as an example, where the end user device is an AR device which communicates with an image processing server acting as the application server. AR itself has a wide range of applications, including gaming, the augmented worker as a part of a smart factory, remote maintenance, ad-hoc support from a remote expert, control of heavy equipment, etc. In this use case, the AR device has a camera and continuously transmits images in real time to an application server, along with the position of the AR device. The application server, in this case an image processing server, renders the augmented image and transmits the augmented video stream back down to the AR device, where it is displayed. The entire process, from uploading the image from the AR device, through processing on the application server, to displaying the augmented image on the AR device, has to occur within milliseconds and with high reliability to ensure the end user has a realistic augmented reality experience.

The 5G URLLC network slice does not have requirements on the protocol or the method used to communicate between the end user device and the application server. The 5G core acts as a pipe allowing communication between these two end points, the AR device and the application server, with the high data rate, reliability and latency requirements dictated by the application. Optionally, depending on the AR use case, two additional KPIs could also be monitored when the audio-visual interaction is characterized by a human being interacting with entities or other humans by relying on audio-visual feedback:

• Motion to photon – The latency between the physical movement of the user's head and the updated image on the AR device.
• Motion to audio – The latency between the physical movement of the user's head and the updated sound waves from the AR device's speakers.
27

28 6.9.1.2 Test Setup

The SUT in this test case would be a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) which is used to test the URLLC use case. A 5G SA core would be required to support basic functionality to authenticate and register the end user device and to establish a PDU session. An image processing server would act as the application server for this test scenario. The 5G core and application server could be completely emulated, partially emulated or real (a non-emulated core and image processing server). An end user device is needed, which for this test scenario would be an AR device: a customized device like AR-based eyeglasses, an AR-based head-mounted device, a generic handset which supports AR, or any other AR end user device needed to support this use case. For the testing of this use case, a real end user device or an emulated one could be used. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of the communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. The SUT will also need to be set up to run this testing in different radio conditions, as outlined in Section 3.6. The end user device must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 5G core. The 5G core may be distributed, with the UPF deployed at the edge location along with the application server, to support the stringent latency requirement of the AR use case. The end user device also needs to be provisioned with the user credentials to register and set up a PDU session with the 5G core and to authenticate/communicate with the application server. The application server must be configured and provisioned with all the necessary information, including relevant images and data, to support the AR use case. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.

13 All the elements in the network like O-RAN system, 5G core and the application server need to have the ability to
14 capture traces to validate the successful execution of the test cases. The end user devices need to have the capability to
15 capture traces/packet capture to calculate the KPIs. Optionally, the network could have network taps deployed in
16 various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
17 different components need to have connectivity with each other – the end user device should be able to connect to O-
18 RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP), O-RAN system needs to be connected to the 5G core which in
19 turn should have connectivity to the application server.

20 6.9.1.3 Test Methodology/Procedure

21 Ensure the end user devices, O-RAN system, 5G core and the application server have all been configured as outlined in
22 section 6.9.1.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all
23 communication between network elements can be captured and validated.

24 1. Power on the end user device in excellent radio condition and ensure the device registers with the 5G core over SA
25 by connecting over the O-RAN (O-RU, O-DU and O-CU-CP/O-CU-UP).
26 2. Once the registration is complete, the end user device has to establish a PDU session over the URLLC network
27 slice, followed by connection to the application server.
28 3. Use the end user device to communicate with the application server for an AR session. Depending on the end user
29 device, this could be done using a finger and/or hand to select the right AR application, or using a button or touch
30 activated system to select the right AR application.
4. Once the application has started, complete the designated task, which could be playing a game, performing a pre-defined task on machinery, troubleshooting, etc.
33 5. Repeat the test multiple times (> 10 times) and gather results.
34 6. Repeat the above steps 1 through 5 for the good, fair and poor radio conditions.
35

36 6.9.1.4 Test Expectation (expected results)

37 As a pre-validation, use the traces to validate a successful registration and PDU session setup to the URLLC network
38 slice by the end user device without any errors over O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate the end user
39 device is able to connect to the application server for the AR session. This is a prerequisite before these tests can be
40 validated.

Validate the end user device is able to set up the AR session and perform the desired task, e.g. remote maintenance. Ensure there is no noticeable video freezing, jitter or lag in the AR session for the end user. Use packet captures to validate there is no jitter or packet drop between the end user device and the application server. Use the packet captures to ensure there are no out-of-sequence packets which could impact the end user's experience. Calculate metrics like latency and throughput using the packet captures.
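As one possible way to derive those metrics, the Python sketch below computes the application-layer downlink throughput and the inter-packet gaps of the AR media stream from a packet capture, assuming Scapy is available; the capture file name and the AR device IP address are placeholders. Large gap spikes are a useful indicator of the video freezing or lag mentioned above.

    from scapy.all import rdpcap, IP, UDP  # pip install scapy

    AR_DEVICE_IP = "10.0.0.10"  # hypothetical AR device address

    pkts = [p for p in rdpcap("ar_session.pcap")
            if IP in p and UDP in p and p[IP].dst == AR_DEVICE_IP]

    times = sorted(float(p.time) for p in pkts)
    window = times[-1] - times[0]
    bits = 8 * sum(len(bytes(p[UDP].payload)) for p in pkts)
    gaps_ms = [(b - a) * 1000.0 for a, b in zip(times, times[1:])]

    print(f"DL application throughput: {bits / window / 1e6:.2f} Mbps")
    print(f"max inter-packet gap: {max(gaps_ms):.2f} ms")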


The KPIs for an Augmented Reality session vary by the use-case type, the vertical being supported and even the product being used for testing. As there is no standard application layer protocol or method used for these use cases, vendors have developed products which have been optimized using their own proprietary mechanisms. The specific values for these KPIs depend on the product used for testing and have to be provided by the product vendor. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI range encountered during testing have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

The KPIs for the URLLC session vary by use-case category, use-case type and even by the vendor products used for testing. The table below [34] gives a generic list of KPI values defined by use-case category. The specific values for these KPIs depend on the product used for testing and have to be provided by the product vendor.

Table 6-33 - URLLC KPIs

Use case group | Use case example | e2e latency | jitter | e2e reliability | user experienced throughput | network availability | time synchronous accuracy | device/connection density
AR/VR | Augmented worker | 10 ms | | 99.9999% | | | |
AR/VR | VR view broadcast | <20 ms | | 99.999% | 40-700 Mbps | | | 3000/km2
Tactile interaction | Cloud Gaming | <7 ms (uplink) | | 99.999% | 1 Gbps | | | 3000/km2
Energy | Differential protection | <15 ms | <160 us | 99.999% | 2.4 Mbps | | 10 us | 10-100/km2
Energy | FISR | <25 ms | | | 10 Mbps | | | 10/km2
Energy | Fault location identification | 140 ms | 2 ms | 99.9999% | 100 Mbps | | 5 us | 10/km2
Energy | Fault mgmt. in distr. power generation | <30 ms | | 99.999% | 1 Mbps | 99.999% | | <2000/km2
Factory of the future | Advanced robotics | <30 ms (task planner); <2 ms to <1-5 ms (robot ctrl) | | 99.9999% to 99.999999% | | | |
Factory of the future | AGV control | 5 ms | | 99.999% | 100 kbps (downlink), 3-8 Mbps (uplink) | | | |
Factory of the future | Robot tooling | 1 ms (robotic motion ctrl); 1-10 ms (machine ctrl) | <50% | 99.9999% | | | |
UAV | UTM connectivity | | | 99.999% | <128 bps | | |
UAV | Cmnd & Ctrl | <100 ms | | 99.999% | | | |
UAV | Payload | application dependent | | | | | |
Position measurement | For AR in smart factory | <15 ms | | 99.9% | | | |
Position measurement | Delivery for inbound logistics in manufacturing | <10 ms | | 99.9% | | | |

14 The capability to capture data is dependent on the end user device’s capability. As a part of gathering data, it is
15 recommended that minimum configuration parameters (see Section 3.3) are included in the test report. The following
16 information is also recommended to be included in the test report to provide a comprehensive view of the test setup.

17 End user device (real or emulated AR device):


1 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

2 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

3 • Downlink transmission mode

4 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
5 (average sample per second)

Table 6-34 Example Test Report for URLLC Testing

URLLC KPIs | Excellent (cell centre) | Good | Fair | Poor (cell edge)
Reliability | | | |
End to End Latency | | | |
User Experienced Throughput | | | |
Motion-to-Photon latency | | | |
Motion-to-Sound delay | | | |
HARQ Retransmission | | | |
DL/UL Latency – User Plane | | | |
DL/UL Latency – Control Plane | | | |
L1 DL throughput [Mbps] | | | |
L1 DL Spectral efficiency [bps/Hz] | | | |
L3 DL PDCP throughput [Mbps] | | | |
Application DL throughput [Mbps] | | | |
UE RSRP [dBm] | | | |
UE RSRQ | | | |
UE PDSCH SINR [dB] | | | |
MIMO rank | | | |
PDSCH MCS | | | |
CQI | | | |
PMI | | | |
RSSI | | | |
Packet delay | | | |
PDSCH BLER [%] | | | |

6.10 mMTC
The 5G network also supports a wide set of use cases which allow machine-type devices to communicate with the network to realise Internet of Things use cases. These use cases fall under a category called mMTC (massive Machine Type Communication). There is a diverse set of use cases under this category, supporting a range of verticals including security, healthcare, remote control, metering and smart city, to name a few. The 5G network has enhanced the RAN and the core to support mMTC requirements. Some of these enhancements include support for a large number of mMTC devices by improving coverage and density of devices. Enhancements also include reducing the overhead for small intermittent data transmissions, thus ensuring low energy usage and extending the battery life of the mMTC devices.

Network slicing in 5G allows a telecommunication service provider to support this diverse set of mMTC use cases by dedicating a separate slice of the network to them. 3GPP has also defined massive Machine Type Communication (mMTC) as a slice which is used to support very large numbers of small devices (millions to billions) in an efficient way, so as to ensure optimal energy utilization. This section of the document includes scenarios to test service(s) which use the mMTC slice.

8 Please see below summary table of test cases and applicable technology.

Table 6-35: mMTC-Sensors Test Case Selection summary

Test ID | Test case | Functional group | LTE | NSA | SA
6.10.1 | Sensors | mMTC-Sensors | N/A | N/A | Y

12 6.10.1 Sensors

13 This test scenario validates the working of sensors over an mMTC slice on a 5G network.

14 6.10.1.1 Test Description

This section of the document gives the procedure to test mMTC applications in a telecommunication network. As already mentioned, mMTC comprises a diverse set of use cases supported across various verticals. Most mMTC use cases include an end user device which communicates with an application server over the 5G network. The end user device varies based on the use case, from a sensor which measures temperature, humidity, weight, etc., to a gauge which measures usage of electricity or water, to a monitoring device which monitors vital statistics. These end user devices collect data and update the application server periodically. The application server collects the data, stores it, analyses it, and can also perform additional tasks if needed based on the outcome of the analysis, e.g. turn off equipment when the temperature is high.

This section of the document uses sensors to perform mMTC testing. The test case consists of deploying 5G sensors in the field to collect data and upload it to an application server. The sensors themselves could vary based on the use case that needs to be supported. They could be air humidity, temperature and soil moisture sensors used in conjunction to identify the need to irrigate a farm. They could be sensors used to identify gas leaks, to alert the authorities or turn off gas lines. They could also be light/image sensors used to identify whether a parking spot is occupied. There are numerous sensors which support a varied set of use cases, but this test case has been intentionally left generic, without specifying the type of sensor that will be used for testing.

The 5G mMTC network slice does not have requirements on the protocol or the method used to communicate between the end user device and the application server. The 5G core acts as a pipe allowing communication between these two end points, the sensors and the application server, with low energy utilization and the ability to support a massive number of such sensors.


1 6.10.1.2 Test Setup

The SUT in this test case would be a gNB (O-RU, O-DU and O-CU-CP/O-CU-UP) which is used to test the mMTC use case. A 5G SA core would be required to support basic functionality to authenticate and register the end user device(s) and to establish a PDU session. An application server is required which has the capability to collect data from the different end user device(s)/sensor(s) used for testing. The 5G core and application server could be completely emulated, partially emulated or real and non-emulated. The end user device(s) in this scenario would be a single sensor or a group of sensors which can communicate over 5G. For the testing of this use case, real end user device(s) or emulated devices which emulate the different sensor(s) could be used. The test setup should include tools which have the ability to collect traces on the elements and/or packet captures of the communication between the elements. This could be a built-in capability of the emulated/non-emulated network elements or an external tool. Optionally, if some of the network elements are located remotely, either in a cloud or on the internet, the additional latency should be calculated and accounted for.

The O-RU, O-DU and O-CU-CP/O-CU-UP need to have the right configuration and software load. The SUT will also need to be set up to run this testing in different radio conditions, as outlined in Section 3.6. The end user device must be configured with the right user credentials to be able to register and authenticate with the O-RAN system and the 5G core. The end user device also needs to be provisioned with the user credentials to register and set up a PDU session with the 5G core and to authenticate/communicate with the application server. The application server must be configured and provisioned with all the necessary information to connect to the different sensors and collect their values. The locations where the radio conditions are excellent, good, fair and poor need to be identified within the serving cell.

20 All the elements in the network like O-RAN system, 5G core and the application server need to have the ability to
21 capture traces to validate the successful execution of the test cases. The end user devices need to have the capability to
22 capture traces/packet capture to calculate the KPIs. Optionally, the network could have network taps deployed in
23 various legs of the network to get packet captures to validate successful execution of the test cases. Finally, all these
24 different components need to have connectivity with each other – the end user device should be able to connect to O-
25 RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP), O-RAN system needs to be connected to the 5G core which in
26 turn should have connectivity to the application server.

27 6.10.1.3 Test Methodology/Procedure

28 Ensure the end user devices, O-RAN system, 5G core and the application server have all been configured as outlined in
29 section 6.10.1.2. All traces and packet captures need to be enabled for the duration of the testing to ensure all
30 communication between network elements can be captured and validated.

1. Power on the end user device(s) in excellent radio conditions and ensure the device registers with the 5G core over SA by connecting over the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP). Depending on the type of mMTC device, this could be done by pushing the power button on the end user device or by connecting the end user device to a battery or power source.
2. Once the registration is complete, the end user device(s) have to establish a PDU session over the mMTC network slice, followed by a connection to the application server.
3. Based on the type of mMTC device, additional steps may be required the first time the device is connected to the network to enable the device to register with the application server. This could include adding the end user device to the application server using a unique ID and/or unique security code, etc.
4. Validate the end user device(s) are able to communicate with the application server periodically and update the sensor data (a minimal sketch of such a periodic update follows this list).
5. Let the test run for a long enough duration of time so that the end user device(s) are able to upload enough data to the application server.
6. Repeat the above steps 1 through 5 for the good, fair and poor radio conditions.
45
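As a minimal sketch of the periodic update in step 4, the following Python snippet emulates a single sensor posting readings to the application server at a fixed period; the server URL, device ID, payload schema and 30 s period are all hypothetical placeholders, since the mMTC slice imposes no particular application protocol.

    import time, random, requests  # pip install requests

    APP_SERVER = "http://app-server.example.test/readings"  # hypothetical endpoint
    DEVICE_ID = "sensor-0001"

    for _ in range(60):  # run long enough to upload a meaningful amount of data
        reading = {"device": DEVICE_ID,
                   "temperature_c": round(random.uniform(18.0, 24.0), 1),
                   "timestamp": time.time()}
        response = requests.post(APP_SERVER, json=reading, timeout=5)
        response.raise_for_status()  # any HTTP error indicates a failed update
        time.sleep(30)               # assumed 30 s reporting period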


1 6.10.1.4 Test Expectation (expected results)

2 As a pre-validation, use the traces to validate a successful registration and PDU session setup to the mMTC network
3 slice by the end user devices without any errors over O-RU, O-DU and O-CU-CP/O-CU-UP. Also validate the end user
4 devices are able to connect to the application server. This is a prerequisite before these tests can be validated.

Validate the end user devices are able to connect to the application server and update the sensor data periodically. Validate the application server is able to receive the data from the sensors, process it and take the necessary actions: updating charts and reports, triggering secondary actions, etc. Use packet captures to validate there are no packet drops between the end user device and the application server. Calculate metrics like latency and throughput using the packet captures.
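A simple way to validate the periodic updates is to check the gaps between consecutive reports against the configured reporting period; the Python sketch below, with an assumed 30 s period, flags suspicious gaps from server-side receive timestamps.

    REPORT_PERIOD = 30.0  # seconds; must match the sensor's configured period

    def check_periodicity(timestamps, tolerance=0.5):
        gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
        missed = [g for g in gaps if g > REPORT_PERIOD * (1 + tolerance)]
        print(f"{len(gaps)} intervals, {len(missed)} suspicious gaps, "
              f"max gap {max(gaps):.1f} s")

    check_periodicity([0.0, 30.1, 60.2, 90.1, 150.3, 180.2])  # one missed report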

The KPIs for the mMTC session vary by use-case category, use-case type and even by the vendor products used for testing. As there is no standard application layer protocol or method used for these use cases, vendors have developed products which have been optimized using their own proprietary mechanisms. The specific values for these KPIs depend on the product used for testing and have to be provided by the product vendor. As there are multiple variables which can impact the testing in this scenario, a KPI outcome outside the range does not necessarily point to a failed test case. Any test results outside the KPI range encountered during testing have to be investigated to identify the root cause, as the issues may be due to variables outside the SUT. If the root cause is not in the SUT, then the issues have to be resolved and the test has to be repeated.

18 The capability to capture data is dependent on the end user device’s capability. As a part of gathering data, it is
19 recommended that minimum configuration parameters (see Section 3.3) are included in the test report. The following
20 information is also recommended to be included in the test report to provide a comprehensive view of the test setup.

21 End user device (real or emulated sensors):

22 • Radio parameters such as RSRP, RSRQ, PDSCH SINR (average sample per second)

23 • PDSCH BLER, PDSCH MCS, MIMO rank (number of layers) (average sample per second)

24 • Downlink transmission mode

25 • Channel utilization, i.e. Number of allocated/occupied downlink PRBs and Number of allocated/occupied slots
26 (average sample per second)

Table 6-36 Example Test Report for mMTC Testing

mMTC KPIs | Excellent (cell centre) | Good | Fair | Poor (cell edge)
RACH Success Rate | | | |
Paging Success Rate | | | |
UE RSRP [dBm] | | | |
UE RSRQ | | | |
UE PDSCH SINR [dB] | | | |
CQI | | | |
Packet delay | | | |
Buffer Status | | | |
DRX Sleep Time | | | |


1 Security
2 This chapter describes the tests evaluating and assessing the security aspects of the E2E of a radio access network. The
3 general test methodologies and configurations outlined in Chapter 3 should be followed.

4 Because the whole O-RAN system is the System under Test (SUT) and can be viewed as an integrated black box in the
5 context of the E2E testing, the security test cases listed here fall into the following categories:

6 1. gNB security assurance specification required by 3GPP SA3 (applicable to 5G-NR NSA/SA);

7 2. Optional: Denial of Service (DoS), fuzzing, and exploitation types of security test cases against the O-RAN
8 system executed on L2 (Ethernet) and L3 (IP) layers with easy to access target information (MAC/IP address)
9 (applicable to 5G-NR NSA/SA);

10 3. Optional: Resource exhaustion types of security test case against the underlying cloud infrastructure hosting
11 the O-RAN system and its network slice(s) (applicable to 5G-NR NSA/SA)

Any other security test cases requiring specific access to a major interface or an internal functionality of the O-RAN component(s) are out of scope.

7.1 gNB Security Assurance Specification (SCAS) required by 3GPP SA3

An E2E O-RAN system (SUT) is equivalent to a gNB from both the network architecture and functional perspectives; therefore, any gNB-specific security requirements, threats and test cases outlined in 3GPP TS 33.511 [27] shall be followed in this E2E radio access network security evaluation and assessment. The following table illustrates all the required test cases listed in Section 4.2.2 of 3GPP TS 33.511 [27].

Table 7-1: List of gNB SCAS Test Cases and applicable technology

Test Case (O-RAN Ref. #) | Test Case (3GPP Ref. #) | Test Name | Description | LTE | NSA | SA
7.1.1 | 4.2.2.1.1 | Integrity protection of RRC-signalling | Verify that the RRC-signalling data sent between UE and gNB over the NG RAN air interface are integrity protected | N/A | Y | Y
7.1.2 | 4.2.2.1.2 | Integrity protection of user data between the UE and the gNB | Verify that the user data packets are integrity protected over the NG RAN air interface. | N/A | N/A | Y
7.1.3 | 4.2.2.1.4 | RRC integrity check failure | Verify that RRC integrity check failure is handled correctly by the gNB | N/A | Y | Y
7.1.4 | 4.2.2.1.5 | UP integrity check failure | Verify that UP integrity check failure is handled correctly by the gNB. | N/A | Y | Y
7.1.5 | 4.2.2.1.6 | Ciphering of RRC-signalling | Verify that the RRC-signalling data sent between UE and gNB over the NG RAN air interface are confidentiality protected. | N/A | Y | Y
7.1.6 | 4.2.2.1.7 | Ciphering of user data between the UE and the gNB | Verify that the user data packets are confidentiality protected over the NG RAN air interface. | N/A | Y | Y
7.1.7 | 4.2.2.1.8 | Replay protection of user data between the UE and the gNB | Verify that the user data packets are replay protected over the NG RAN air interface. | N/A | Y | Y
7.1.8 | 4.2.2.1.9 | Replay protection of RRC-signalling | Verify the replay protection of RRC-signalling between UE and gNB over the NG RAN air interface. | N/A | Y | Y
7.1.9 | 4.2.2.1.10 | Ciphering of user data based on the security policy sent by the SMF | Verify that the user data packets are confidentiality protected based on the security policy sent by the SMF via AMF | N/A | N/A | Y
7.1.10 | 4.2.2.1.11 | Integrity of user data based on the security policy sent by the SMF | Verify that the user data packets are integrity protected based on the security policy sent by the SMF. | N/A | N/A | Y
7.1.11 | 4.2.2.1.12 | AS algorithms selection | Verify that the eNB/gNB selects the algorithms with the highest priority in its configured list. | N/A | N/A | Y
7.1.12 | 4.2.2.1.13 | Key refresh at the gNB | Key refresh at gNB | N/A | N/A | Y
7.1.13 | 4.2.2.1.14 | Bidding down prevention in Xn-handovers | Verify that bidding down is prevented in Xn-handovers. | N/A | N/A | Y
7.1.14 | 4.2.2.1.15 | AS protection algorithm selection in gNB change | Verify that the AS (Algorithm Selection) protection algorithm is selected correctly | N/A | Y | Y
7.1.15 | 4.2.2.1.16 | Control plane data confidentiality protection over N2/Xn interface | Verify the control plane data confidentiality protection over the N2/Xn interface | N/A | N/A | Y
7.1.16 | 4.2.2.1.17 | Control plane data integrity protection over N2/Xn interface | Verify the control plane data integrity protection over the N2/Xn interface | N/A | Y | Y
7.1.17 | 4.2.2.1.18 | Key update at the gNB on dual connectivity | Key update at the gNB on dual connectivity – 2 test cases | N/A | Y | Y


7.2 Optional: DoS, fuzzing and blind exploitation types of security tests

Due to the open and disaggregated nature of the O-RAN system (SUT), the attack surfaces associated with some of its critical transport protocols and major interfaces become easy targets for potential attackers. Cyberattacks of the DoS, fuzzing and blind exploitation types are easy to launch, require little information about the target system, and could cause significant performance degradation, or even service interruption, if not properly mitigated.

8 As the WG1 Security Task Group (STG) is continuously working on the O-RAN system security analysis and security
9 requirements specifications, the proposed security test cases in this section are optional for this E2E test specification
10 release.

11 As we learn more about the O-RAN system security posture against these types of attacks through the proposed test
12 cases, we will reevaluate their necessity in the subsequent E2E test specification releases.

The duration of the test traffic generation specified in this section should be at least 3 minutes.
14 Please see below summary table of test cases and applicable technology.

Table 7-2: DoS and Fuzzing Test Case Selection summary

Test ID | Test case | Functional group | LTE | NSA | SA
7.2.1 | S-Plane PTP DoS Attack | Security | N/A | Y | Y
7.2.2 | C-Plane eCPRI DoS Attack | Security | N/A | Y | Y
7.2.3 | Near-RT RIC A1 Interface DoS Attack | Security | N/A | Y | Y
7.2.4 | S-Plane PTP Unexpected Input | Security | N/A | Y | Y
7.2.5 | C-Plane eCPRI Unexpected Input | Security | N/A | Y | Y
7.2.6 | Near-RT RIC A1 Interface Unexpected Input | Security | N/A | Y | Y

17 7.2.1 S-Plane PTP DoS Attack (Network layer)

18 7.2.1.1 Test description & applicability

19 The purpose of the test is to verify that a predefined volumetric DoS attack against O-DU S-Plane will not degrade
20 service availability or performance of the SUT in a meaningful way.

21 7.2.1.2 Test setup and configuration

22 The test requires easy to access MAC address information of the O-DU’s open fronthaul interface and L2 connectivity
23 (e.g. over L2 network switching device) to the target from the emulated attacker.

24 Please refer to the diagram below for the test setup and configuration:


[Figure: S-Plane DoS attack test setup, with the emulated attacker targeting the O-DU S-Plane between the UE or emulated UE and the 5GC or emulated 5GC]

2 7.2.1.3 Test procedure

• Ensure normal UE procedures1 and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a test tool to generate various levels of volumetric DoS attack against the MAC address of the O-DU S-Plane (a minimal traffic-generation sketch follows this list):

o Volumetric tiers: 10 Mbps, 100 Mbps, 1 Gbps

o DoS traffic types: generic Ethernet frames, PTP announce/sync messages

o DoS source address: spoofed MAC of the PTP GM, random source MACs

• Observe the functional and performance impact on the SUT
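A minimal traffic-generation sketch for this procedure is shown below, using Scapy; the MAC addresses and interface name are placeholders, PTP over Ethernet is identified by EtherType 0x88F7, and precise rate shaping to the 10M/100M/1G tiers is left to the test tool.

    from scapy.all import Ether, Raw, RandMAC, sendp  # pip install scapy

    O_DU_MAC   = "aa:bb:cc:00:00:01"  # hypothetical O-DU fronthaul MAC
    PTP_GM_MAC = "aa:bb:cc:00:00:99"  # hypothetical PTP GM MAC (spoofed source)

    # generic Ethernet frames from random source MACs
    generic = Ether(src=RandMAC(), dst=O_DU_MAC) / Raw(load=bytes(1000))
    # crude PTP-like frame (EtherType 0x88F7); protocol-correct mutation is
    # covered by the fuzzing case in Section 7.2.4
    ptp_like = Ether(src=PTP_GM_MAC, dst=O_DU_MAC, type=0x88F7) / Raw(load=bytes(64))

    sendp([generic, ptp_like], iface="eth0", loop=1, inter=0.0001)  # flood until stopped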

11 7.2.1.4 Test requirements (expected results)

12 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use Sec.
13 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark
14 for indicating correct behavior of the SUT.

• DoS attacks should be mitigated by the O-DU

16 7.2.2 C-Plane eCPRI DoS Attack (Network layer)

17 7.2.2.1 Test description & applicability

18 The purpose of the test is to verify that a predefined volumetric DoS attack against O-DU C-Plane will not degrade
19 service availability or performance of the SUT in a meaningful way.

1 UE procedures include initial acquisition, registration/de-registration, attach/detach, paging, service request, handover, …


1 7.2.2.2 Test setup and configuration

2 The test requires easy to access MAC address information of the O-DU’s open fronthaul interface and L2 connectivity
3 (e.g. over L2 network switching device) to the target from the emulated attacker.

4 Please refer to the diagram below for the test setup and configuration:

[Figure: C-Plane DoS attack test setup, with the emulated attacker targeting the O-DU C-Plane between the UE or emulated UE and the 5GC or emulated 5GC]

6 7.2.2.3 Test procedure

• Ensure normal UE procedures1 and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a test tool to generate various levels of volumetric DoS attack against the MAC address of the O-DU C-Plane (a minimal traffic-generation sketch follows this list):

o Volumetric tiers: 10 Mbps, 100 Mbps, 1 Gbps

o DoS traffic types: eCPRI real-time control data messages over Ethernet

o DoS source address: spoofed MAC of the O-RU(s), random source MACs

• Observe the functional and performance impact on the SUT
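The C-Plane case can be sketched the same way, with eCPRI frames (EtherType 0xAEFE) and a spoofed O-RU source MAC; addresses and interface are placeholders, and the two leading payload bytes only approximate an eCPRI common header with message type 2 (real-time control data), which is sufficient for a purely volumetric test.

    from scapy.all import Ether, Raw, RandMAC, sendp  # pip install scapy

    O_DU_MAC = "aa:bb:cc:00:00:01"  # hypothetical O-DU fronthaul MAC
    O_RU_MAC = "aa:bb:cc:00:00:02"  # hypothetical O-RU MAC (spoofed source)

    spoofed = Ether(src=O_RU_MAC, dst=O_DU_MAC, type=0xAEFE) / \
              Raw(load=b"\x10\x02" + bytes(200))   # approximate eCPRI header + padding
    random_src = Ether(src=RandMAC(), dst=O_DU_MAC, type=0xAEFE) / Raw(load=bytes(202))

    sendp([spoofed, random_src], iface="eth0", loop=1, inter=0.0001)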

15 7.2.2.4 Test requirements (expected results)

16 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
17 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
18 benchmark for indicating correct behavior of the SUT.

• DoS attacks should be mitigated by the O-DU.

20 7.2.3 Near-RT RIC A1 Interface DoS Attack (Network layer)

21 7.2.3.1 Test description & applicability

22 The purpose of the test is to verify that a predefined volumetric DoS attack against Near-RT RIC A1 interface will not
23 degrade service availability or performance of the SUT in a meaningful way


1 7.2.3.2 Test setup and configuration

2 The test requires easy to access IP address information of the Near-RT RIC’s A1 interface and a routable path to the
3 target from the emulated attacker.

4 Please refer to the diagram below for the test setup and configuration:

[Figure: A1 interface DoS attack test setup, with the emulated attacker targeting the Near-RT RIC A1 interface between the UE or emulated UE and the 5GC or emulated 5GC]

6 7.2.3.3 Test procedure

• Ensure normal UE procedures1 and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a test tool to generate various levels of volumetric DoS attack against the IP address of the Near-RT RIC A1 interface (a minimal traffic-generation sketch follows this list):

o Volumetric tiers: 10 Mbps, 100 Mbps, 1 Gbps

o DoS traffic types: generic UDP packets, HTTP/HTTPS REST API calls

o DoS source address: spoofed IP of the Non-RT RIC, random source IPs or broadcast IP (UDP only)

• Observe the functional and performance impact on the SUT
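The sketch below illustrates the two A1 traffic types, a UDP flood with spoofed sources and a loop of REST calls; the Near-RT RIC address, port and URL path are placeholders, and the two parts are meant to be run separately since each blocks.

    import requests
    from scapy.all import IP, UDP, Raw, RandIP, send  # pip install scapy

    A1_IP = "192.0.2.10"  # hypothetical Near-RT RIC A1 address

    # part 1: UDP flood with random spoofed source IPs (blocks until stopped)
    send(IP(src=RandIP(), dst=A1_IP) / UDP(dport=8080) / Raw(load=bytes(1200)),
         loop=1, inter=0.0001)

    # part 2: HTTP(S) REST API flood against an assumed A1 policy path
    while True:
        try:
            requests.get(f"http://{A1_IP}:8080/a1-p/policytypes", timeout=1)
        except requests.RequestException:
            pass  # transport errors are irrelevant; observe the SUT instead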

16 7.2.3.4 Test requirements (expected results)

17 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
18 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
19 benchmark for indicating correct behavior of the SUT. DoS attacks should be mitigated by O-RAN system
20 perimeter security or Near-RT RIC.

21 7.2.4 S-Plane PTP Unexpected Input (Network layer)

22 7.2.4.1 Test description & applicability

23 The purpose of the test is to verify that an unexpected (not in-line with protocol specification) input sent towards O-DU
24 S-Plane will not compromise the security of the SUT


1 7.2.4.2 Test setup and configuration

2 The test requires easy to access MAC address information of the O-DU’s open fronthaul interface and L2 connectivity
3 (e.g. over L2 network switching device) to the target from the emulated attacker.

4 Please refer to the diagram below for the test setup and configuration:

[Figure: S-Plane unexpected input test setup, with the emulated attacker injecting mutated PTP messages towards the O-DU between the UE or emulated UE and the 5GC or emulated 5GC]

6 7.2.4.3 Test procedure

• Ensure normal UE procedures1 and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a packet capture tool to capture a sample of a legitimate PTP message sent towards the O-DU S-Plane

• Use a fuzzing tool to replay the captured PTP message while mutating its content and keeping the original source/destination MAC addresses. Send at least 250,000 iterations of the mutated PTP message based on a random seed (a minimal sketch follows this list)

• Observe the functional and performance impact on the SUT
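A minimal replay/mutation sketch for this procedure is given below, using Scapy; the capture file and interface names are placeholders. The Ethernet header (source/destination MACs and EtherType) of the captured frame is preserved while random payload bytes are flipped, and the random seed is fixed so that any frame that triggers a failure can be reproduced.

    import random
    from scapy.all import rdpcap, Ether, Raw, sendp  # pip install scapy

    SEED = 20220731
    random.seed(SEED)  # record the seed so failing iterations can be replayed

    template = rdpcap("ptp_sample.pcap")[0]  # one legitimate captured PTP frame
    raw = bytes(template)
    header, body = raw[:14], raw[14:]        # keep dst/src MAC and EtherType intact

    for i in range(250_000):
        mutated = bytearray(body)
        for _ in range(random.randint(1, 8)):             # flip a few random bytes
            mutated[random.randrange(len(mutated))] = random.randrange(256)
        sendp(Ether(header) / Raw(load=bytes(mutated)), iface="eth0", verbose=False)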

15 7.2.4.4 Test requirements (expected results)

16 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
17 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
18 benchmark for indicating correct behavior of the SUT.

19 7.2.5 C-Plane eCPRI Unexpected Input (Network layer)

20 7.2.5.1 Test description & applicability

21 The purpose of the test is to verify that an unexpected (not in-line with protocol specification) input sent towards O-DU
22 C-Plane will not compromise the security of the SUT

23 7.2.5.2 Test setup and configuration

24 The test requires easy to access MAC address information of the O-DU’s open fronthaul interface and L2 connectivity
25 (e.g. over L2 network switching device) to the target from the emulated attacker.


1 Please refer to the diagram below for the test setup and configuration:

[Figure: C-Plane unexpected input test setup, with the emulated attacker injecting mutated eCPRI messages towards the O-DU between the UE or emulated UE and the 5GC or emulated 5GC]

3 7.2.5.3 Test procedure

• Ensure normal UE procedures1 and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a packet capture tool to capture a sample of a legitimate eCPRI message sent towards the O-DU C-Plane

• Use a fuzzing tool to replay the captured eCPRI message while mutating its content and keeping the original source/destination MAC addresses. Send at least 250,000 iterations of the mutated eCPRI message based on a random seed (the replay sketch in Section 7.2.4.3 applies unchanged, with an eCPRI frame as the template)

• Observe the functional and performance impact on the SUT

12 7.2.5.4 Test requirements (expected results)

13 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
14 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
15 benchmark for indicating correct behavior of the SUT.

16 7.2.6 Near-RT RIC A1 Interface Unexpected Input (Network layer)

17 7.2.6.1 Test description & applicability

The purpose of the test is to verify that an unexpected input (not in line with the protocol specification) sent towards the Near-RT RIC A1 interface will not compromise the security of the SUT.

20 7.2.6.2 Test setup and configuration

The test requires easy access to the IP address information of the Near-RT RIC's A1 interface and a routable path from the emulated attacker to the target.

23 Please refer to the diagram below for the test setup and configuration:


[Figure: A1 interface Unexpected Input test setup — the SUT between a UE (or emulated UE) and a 5GC (or emulated 5GC), with the emulated attacker having a routable path to the Near-RT RIC A1 interface]

2 7.2.6.3 Test procedure

• Ensure normal UE procedures¹ and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a packet capture tool to capture a sample of legitimate HTTP/HTTPS REST API messages sent towards the Near-RT RIC A1 interface.

• Use a fuzzing tool to replay the captured HTTP/HTTPS REST API message while mutating its content and keeping the original source/destination IP addresses/ports. Send at least 250,000 iterations of the mutated HTTP/HTTPS REST API message based on a random seed (an application-layer sketch is given after this list).

• Observe the functional and performance impact on the SUT.
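At the application layer, the replay-with-mutation step can be sketched with the Python requests library as below. The Near-RT RIC address, the A1 resource path, and the JSON body are placeholders standing in for values taken from the packet capture; they are not normative, and real A1 policy endpoints and payloads depend on the deployed RIC.

# Illustrative only: replaying a captured A1 REST request with a mutated JSON body.
# BASE, PATH, and captured_body are placeholders for values from the capture.
import copy
import random

import requests

random.seed(1234)
BASE = "https://near-rt-ric.example:443"        # assumed Near-RT RIC A1 address
PATH = "/a1-p/policytypes/1000/policies/1"      # placeholder path from the capture
captured_body = {"threshold": 10, "scope": {"ueId": "ue-1"}}  # placeholder payload

def mutate(obj):
    """Randomly corrupt one scalar inside a JSON-like structure."""
    if isinstance(obj, dict) and obj:
        key = random.choice(list(obj))
        obj[key] = mutate(obj[key])
        return obj
    if isinstance(obj, list) and obj:
        i = random.randrange(len(obj))
        obj[i] = mutate(obj[i])
        return obj
    # Scalar: replace with an out-of-spec value
    return random.choice([None, -1, 2**63, "A" * 4096, {}, []])

for _ in range(250_000):
    body = mutate(copy.deepcopy(captured_body))
    try:
        requests.put(BASE + PATH, json=body, timeout=2, verify=False)
    except requests.RequestException:
        continue  # connection errors are themselves an observation point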

12 7.2.6.4 Test requirements (expected results)

13 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
14 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
15 benchmark for indicating correct behavior of the SUT.

7.2.7 Blind exploitation of well-known vulnerabilities over Near-RT RIC A1 interface (Network layer)

18 7.2.7.1 Test description & applicability

The purpose of the test is to verify that exploitation attempts of well-known vulnerabilities, executed blindly against the Near-RT RIC A1 interface, will not compromise the security of the SUT.

21 7.2.7.2 Test setup and configuration

The test requires easy access to the IP address information of the Near-RT RIC's A1 interface and a routable path from the emulated attacker to the target.

24 Please refer to the diagram below for the test setup and configuration:


[Figure: Exploitation over A1 interface test setup — the SUT between a UE (or emulated UE) and a 5GC (or emulated 5GC), with the emulated attacker having a routable path to the Near-RT RIC A1 interface]

2 7.2.7.3 Test procedure

• Ensure normal UE procedures¹ and user-plane traffic can be handled properly through the SUT. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Make sure that the vulnerability scanning tool has an up-to-date database of well-known vulnerabilities (signatures/plugins) based on Common Vulnerabilities and Exposures (CVE). Document the actual version of the vulnerability database (signatures/plugins) for further reference.

• Use the vulnerability scanning tool to execute a scan against the IP address of the Near-RT RIC A1 interface (a minimal command sketch is given after this list). The scan should have the following parameters defined:

11 o TCP Ports: 0-65535

12 o UDP Ports: Top 1000

13 o Safe Checks: Disabled (to make sure that exploitation attempts of the vulnerabilities will be
14 performed)

• Observe the functional and performance impact on the SUT.
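For illustration, the port-coverage parameters above can be exercised with nmap driven from Python, as sketched below. The "Safe Checks: Disabled" setting is specific to the chosen vulnerability scanner (e.g. a scan-policy option in Nessus-class tools) and is therefore not shown here; the target address is a placeholder.

# Illustrative only: driving the port-coverage part of the scan via subprocess.
# SYN and UDP scans typically require root privileges.
import subprocess

TARGET = "192.0.2.10"  # placeholder Near-RT RIC A1 IP address

# Full TCP port range 0-65535, as required by the test parameters
subprocess.run(["nmap", "-sS", "-p0-65535", "-oN", "a1_tcp.txt", TARGET], check=True)

# Top 1000 UDP ports, as required by the test parameters
subprocess.run(["nmap", "-sU", "--top-ports", "1000", "-oN", "a1_udp.txt", TARGET], check=True)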

16 7.2.7.4 Test requirements (expected results)

17 • No meaningful degradation of service availability and performance of the SUT. It is recommended to use
18 Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a
19 benchmark for indicating correct behavior of the SUT.

• Identification of well-known vulnerabilities is out of scope of the test.

7.3 O-Cloud resource exhaustion type of security test (Virtualization layer)
Because the O-RAN system runs on a virtualized platform, the O-Cloud, security attacks against this underlying infrastructure could have profound performance and service impacts on the O-RAN system.


Among the various types of threats to the O-Cloud infrastructure, a resource exhaustion attack, such as the O-Cloud side-channel noisy neighbor attack, is one of the easiest to launch and could impact the performance of the network slice(s), especially on a small-size edge cloud with shared physical hosts/resources.

4 As the WG1 Security Task Group (STG) is continuously working on the O-RAN system security analysis and security
5 requirements specifications, the proposed security test case in this section will be optional for this E2E test specification
6 release.

7 As we learn more about the O-RAN system security posture against these types of attacks through the proposed test
8 case, we will re-evaluate its necessity in the following E2E test specification release.

Please see the summary table of test cases and applicable technology below.

10 Table 7-3: O-Cloud resource exhaustion Test Case Selection summary

TEST ID | Test case                       | Functional group | Applicable technology (LTE / NSA / SA)
7.3.1   | O-Cloud side-channel DoS Attack | Security         | N/A / N/A / Y

13 7.3.1 O-Cloud side-channel DoS attack

14 7.3.1.1 Test description & applicability

The purpose of the test is to verify that a noisy neighbor DoS attack against the O-Cloud, aiming at resource starvation, will not degrade the service availability or performance of the O-RAN system served by the network slice under test.

17 7.3.1.2 Test setup and configuration

18 The test requires access to the O-Cloud platform hosting the network slice(s) of the O-RAN system.

19 Please refer to the diagram below for the test setup and configuration:

[Figure: O-Cloud side-channel DoS attack test setup — the SUT between a UE (or emulated UE) and a 5GC (or emulated 5GC), hosted on the O-Cloud, where the side-channel DoS attack is launched]

Noisy Neighbor VNF(s) can be deployed into an existing slice, or into a new slice sharing resources with the slice under test.


1 7.3.1.3 Test procedure

• Ensure normal UE procedures¹ and user-plane traffic can be handled properly through the E2E O-RAN system served by the O-Cloud RAN slice under test. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT.

• Use a test tool (through the O-Cloud MANO) to instantiate noisy neighbor VNFs (a minimal workload sketch is given after this list)
o Noisy Neighbor tenant: an existing slice, or a new slice sharing resources with the existing slice

• Observe the functional and performance impact on the E2E O-RAN system served by the O-Cloud RAN slice under test.
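As an illustration, a noisy neighbor workload can be generated inside the instantiated VNF/container with a generic stress tool. The sketch below assumes the stress-ng utility is available in the image; in a real run the VNF is instantiated through the O-Cloud MANO rather than started manually, and the resource percentages and duration are examples only.

# Illustrative only: a minimal "noisy neighbor" workload run inside a VNF/container
# that shares a physical host with the slice under test (assumes stress-ng installed).
import subprocess

subprocess.run([
    "stress-ng",
    "--cpu", "0",            # spin workers on all available CPUs
    "--vm", "4",             # 4 memory-thrashing workers...
    "--vm-bytes", "75%",     # ...consuming 75% of available memory
    "--io", "4",             # 4 I/O-sync workers
    "--timeout", "600s",     # run for 10 minutes, then stop
    "--metrics-brief",       # print a short load summary at the end
], check=True)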

10 7.3.1.4 Test requirements (expected results)

11 • No degradation of service availability and performance of the E2E O-RAN system served by the network slice
12 under test. It is recommended to use Section 5.6 Bidirectional throughput in different radio conditions and
13 Section 6.1 Data Services tests as a benchmark for indicating correct behavior of the SUT

14 • The Noisy Neighbor attack should be properly logged and alerted by the O-Cloud


1 Load and Stress Tests


This chapter describes the tests evaluating and assessing the load and stress behaviour of the radio access network from a network end-to-end perspective. The focus of the testing is on the tolerance of the eNB/gNB under load based on 3GPP and O-RAN specifications.

The eNB/gNB should have the ability to process the various traffic patterns which can occur in the field. Load and stress tests are used to evaluate the tolerance of the eNB/gNB to load. They are a test methodology performed in the laboratory to simulate the actual traffic load that an eNB/gNB is subjected to in the field. Load and stress tests help to detect performance-related problems of the entire system, which are "less reproducible" and difficult to solve in general. In some cases, even problems that could not be found by functional testing are discovered. Load and stress tests in addition to functional testing will help to ensure the final quality of the eNB/gNB products. This improves the overall quality of the system, which in turn improves the user experience.

Please see the summary table of test cases and applicable technology below.

15 Table 8-1: Load and Stress Tests summary

TEST ID | Test case                                    | Functional group | Applicable technology (LTE / NSA / SA)
8.1     | Simultaneous RRC_CONNECTED UEs               | Load and Stress  | Y / Y / Y
8.2     | Benchmark of UE State Transition             | Load and Stress  | Y / Y / Y
8.3     | Traffic Load Testing                         | Load and Stress  | Y / N/A / Y
8.4     | Traffic Model Testing                        | Load and Stress  | Y / Y / Y
8.5     | Long hours stability Testing                 | Load and Stress  | Y / Y / Y
8.6     | Multi-cell Testing                           | Load and Stress  | Y / Y / Y
8.7     | Emergency call                               | Load and Stress  | Y / Y / Y
8.8     | ETWS (Earthquake and Tsunami Warning System) | Load and Stress  | Y / Y / Y

17 8.1 Simultaneous RRC_CONNECTED UEs

18 8.1.1 Test description and applicability

The purpose of this test is to connect multiple UEs to the eNB/gNB and measure the maximum number of UEs that can be simultaneously maintained in RRC_CONNECTED state. By connecting multiple UEs at the same time, basic eNB/gNB capacity is verified. This test is valid with either LTE or 5G NSA/SA. The test procedure is to stack up the number of UEs connected to the eNB/gNB one by one and check the maximum number of simultaneously connected UEs. To perform this test, each connected UE needs to transmit a minimum-size U-Plane packet periodically, such as a ping, to keep the RRC_CONNECTED state. The transmit interval of the packets should be shorter than the RRC inactivity timer value in the eNB/gNB. Figure 8-1 (Simultaneous RRC_CONNECTED UEs) illustrates how the test works.


[Figure: plot of the number of connected UEs over time. UEs are added one by one (call attempt, RRC_CONNECTED, release) with a fixed call interval and connection holding time; the count rises until call attempts start to fail, marking the maximum number of simultaneously connected UEs.]

Figure 8-1 Simultaneous RRC_CONNECTED UEs

3 8.1.2 Test setup and configuration

4 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with
5 multiple UEs (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE)
6 or 5G SS-RSRP (for 5G NSA/SA) as the metric.
Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.
9 This test is primarily C-Plane (i.e. RRC and RLC) capacity benchmarking and SUT capacity should be limited by O-
10 DU and O-CU capacities, not O-RU. If the O-DU and O-CU are not fully utilized by the configured RF cell and
11 associated UEs, the optional OFH and/or F1 interface and test equipment as described in Section 3.1.1 should be used in
12 addition to the configured RF cell and UEs to add more UE traffic directly at the interfaces to O-DU and O-CU
13 comprising the SUT.
14 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
15 SUT and test equipment, parameters) should be recorded in the test report.
16 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
17 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
18 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
19 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
20 exceeded.

21 8.1.3 Test Procedure


22 The test steps below are applicable for either LTE or 5G NSA/SA:

23 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
24 report. The serving cell under test is activated and unloaded. All other cells are powered off.
25 2. The UEs (emulated UE) are placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
26 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
27 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
28 should not be any connectivity issues.
29 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
30 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.


1 5. "Power ON" the UEs one by one, connect them to the LTE or 5G NSA/SA cell, and confirm that they are in the
2 RRC_CONNECTED state normally. The “Power ON” intervals may be better to longer than the attach latency
3 described in Chapter4. Increase the number of UEs until the newly powered UE fails to connect to LTE or 5G
4 NSA/SA cell. The UE in RRC_CONNECTED state, to keep their RRC connection, should send some data packets
5 like Ping. At this time, it is necessary to set the connection holding time at least 3 minutes in Figure 8 -1 to be
6 sufficiently long.
6. Check the measured number of RRC_CONNECTED UEs for the SUT, and if the measured value is not consistent with expectations for the entire SUT (O-DU/O-CU) being fully loaded, use the optional OFH and/or F1 interface and test equipment (see Section 3.1.1) to add more UEs in RRC_CONNECTED state until the entire SUT is fully utilized (e.g. >80%) during execution of the test case procedure. After the holding time, check the maximum number of UEs that were in RRC_CONNECTED state simultaneously.
12 7. Lost connections shall be re-established automatically to maximize number of RRC Connected UEs.
13 8. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
14 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
15 and measurements.
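As a minimal illustration of the keep-alive requirement in step 5, the sketch below sends one small ping per UE at an interval shorter than the RRC inactivity timer so the UEs remain in RRC_CONNECTED. The timer value, the destination, and the per-UE source addresses are lab-specific placeholders; a UE emulator would normally provide this function natively.

# Illustrative only: periodic minimum-size pings keeping emulated UEs connected.
# Assumes a Linux host and per-UE source IP addresses bound locally.
import subprocess
import time

RRC_INACTIVITY_TIMER_S = 10          # assumed eNB/gNB setting (lab-specific)
KEEPALIVE_INTERVAL_S = RRC_INACTIVITY_TIMER_S / 2   # must stay below the timer
PDN_TARGET = "192.0.2.1"             # placeholder address reachable via the core

ue_source_ips = [f"10.45.0.{i}" for i in range(2, 52)]  # placeholder UE addresses

while True:  # run until the test ends (stop with Ctrl-C)
    for src in ue_source_ips:
        # -I selects the UE's source address, -c 1 sends one echo,
        # -s 8 keeps the U-Plane packet minimal
        subprocess.Popen(["ping", "-I", src, "-c", "1", "-s", "8", PDN_TARGET],
                         stdout=subprocess.DEVNULL)
    time.sleep(KEEPALIVE_INTERVAL_S)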

16 8.1.4 Test requirements (expected results)

17 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
18 should be captured and reported in the test report for performance assessment.

19 • Radio parameters such as RSRP, RSRQ


20 • KPIs mentioned in Table 8-2
21 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

Validate the successful procedures from the collected logs. Check the maximum number of UE connections. If call loss has occurred in this test for more than a certain percentage of UEs (e.g. 2%), the validity of the cause of the call loss shall be confirmed.

25 Table 8-2 Maximum Number of simultaneous RRC_CONNECTED UEs

RAT     | Maximum Number of simultaneous RRC_CONNECTED UEs
LTE     |
NR(NSA) |
NR(SA)  |

27 8.2 Benchmark of UE State Transition

28 8.2.1 Test description and applicability

The purpose of this test is to verify the benchmark value for the number of UE state transitions that can be processed per unit time, by connecting multiple UEs to an eNB/gNB.
As shown in Figure 8-2, a certain number of UEs (e.g. 100 UEs, 200 UEs, etc., attached one by one at a reasonable ramp rate, for example 10 UEs per second) repeatedly perform UE state transitions (RRC_IDLE to/from RRC_CONNECTED) for a certain period of time (X seconds). The number of UE calls is increased by a fixed number (e.g. 100 UEs); if a finer resolution is required, it is increased by 10 UEs. The number of UE calls in the X-seconds interval is incrementally increased, and the test is performed until a failure to connect occurs, as shown in the example at


the right end of Figure 8-2. By connecting multiple UEs at the same time, the basic eNB/gNB control-plane processing capability is verified. This test is valid with either LTE or 5G NSA/SA.
The test procedure is to gradually increase the number of UEs connected to the eNB/gNB and check the maximum number of simultaneously connected UEs. In order to transition each UE alternately and continuously between the RRC_CONNECTED and RRC_IDLE states, the UE needs to transmit a minimum-size U-Plane packet periodically, such as a ping. The transmit interval of the packets should be longer than the RRC inactivity timer value in the eNB/gNB. By having many UEs repeat state transitions, the limit of the eNB/gNB processing capacity is tested. In addition, by repeating call generation and call release every X seconds, it shall be confirmed that there are no problems such as processing delays or memory leaks.

[Figure: plot of the number of UEs over time. The UE count performing RRC_IDLE/RRC_CONNECTED transitions is stepped up (100, 200, 300, 400, 500, ...) in successive intervals of X seconds each.]

Figure 8-2 Benchmark of UE State Transition

13 8.2.2 Test setup and configuration

14 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs
15 (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G SS-
16 RSRP (for 5G NSA/SA) as the metric.

Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.

19 This test is primarily C-Plane (i.e. RRC and RLC) performance benchmarking and SUT performance should be limited
20 by O-DU and O-CU performance, not O-RU. If the O-DU and O-CU are not fully loaded to expectations using the
21 configured RF cell and associated UEs, the optional OFH and/or F1 interface and test equipment as described in Section
22 3.1.1 should be used in addition to the configured RF cell and UEs to add more UE traffic directly at the interfaces to O-
23 DU and O-CU comprising the SUT.

24 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
25 SUT and test equipment, parameters) should be recorded in the test report.
26 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
27 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
28 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
29 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
30 exceeded.


1 8.2.3 Test Procedure

2 The test steps below are applicable for either LTE or 5G NSA/SA:

3 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
4 report. The serving cell under test is activated and unloaded. All other cells are powered off.
5 2. The UEs (emulated UE) are placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
6 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
7 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
8 should not be any connectivity issues.
9 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
10 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
5. "Power ON" the UEs (emulated UEs) and attach them to the LTE or 5G NSA/SA cell one by one at a reasonable ramp rate, for example 10 UEs per second. Wait for a successful attach. Each UE's state transition can be checked from the control-plane messages received.
6. Ensure that all UEs perform state transitions at regular intervals for X seconds (at least 3 minutes) and verify that they are processed correctly. All UEs are then powered off. It should be checked whether any UEs failed to make the RRC state transition at the eNB/gNB.
7. Check the number of UEs that can be processed per unit time (a minimal rate-computation sketch is given after this procedure). Increase the number of UEs and repeat steps 5-6 above until processes such as state transitions no longer work properly, in order to identify the upper limit of the processing capacity. The number of UE calls is increased by a fixed number (100 UEs); if a finer resolution is required, it is increased by 10 UEs.
21 8. Check the maximum measured RRC Connected UEs for the SUT in step 5-7 and if the measured value is not
22 consistent with expectations for the entire SUT (O-DU/O-CU) being fully loaded, use the optional OFH and/or F1
23 interface and test equipment (see Section 3.1.1) to add more UEs and RRC Connected state transitions until the
24 entire SUT is fully loaded (e.g. >80%) during execution of the test case procedure.
25 9. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
26 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
27 and measurements.
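For illustration, the per-unit-time benchmark in step 7 can be computed from an event log as sketched below, assuming the UE emulator exports one timestamped line per completed state transition; the log format is a placeholder.

# Illustrative only: computing the state-transition rate from a placeholder log,
# e.g. lines like "2022-07-01T12:00:00.123 UE0042 CONNECTED->IDLE".
from datetime import datetime

def transitions_per_second(log_path: str) -> float:
    """Return completed transitions per second over the logged window."""
    timestamps = []
    with open(log_path) as f:
        for line in f:
            ts, _ue, _event = line.split(maxsplit=2)
            timestamps.append(datetime.fromisoformat(ts))
    window = (max(timestamps) - min(timestamps)).total_seconds()
    return len(timestamps) / window if window > 0 else 0.0

print(f"{transitions_per_second('ue_transitions.log'):.1f} transitions/s")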

28 8.2.4 Test requirements (expected results)

29 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
30 should be captured and reported in the test report for performance assessment.

31 • Radio parameters such as RSRP, RSRQ


32 • KPIs mentioned in Table 8-3
33 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

Validate the successful procedures from the collected logs. Check the rate of UE state transitions that can be processed per unit time. If call loss has occurred for more than a certain percentage of UEs (e.g. 2%) during an X-seconds test, the validity of the cause of the call loss shall be confirmed.

37 Table 8-3 Maximum Rate of UE State Transition Benchmark

RAT     | Maximum Rate of UE State Transition (per second) | Number of UEs
LTE     |                                                  |
NR(NSA) |                                                  |
NR(SA)  |                                                  |


1 8.3 Traffic Load Testing

2 8.3.1 Test description and applicability

The purpose of this test is to check the stability of the eNB/gNB under load, with a large number of UEs sending and receiving user data. By connecting a large number of UEs at the same time, the eNB/gNB processing capacity and stability are verified. The maximum cell throughput is discussed in Chapter 5 and is not in the scope of this chapter. This test is valid with either LTE or 5G SA.

The test procedure is to connect X UEs (e.g. 10 UEs) per second to the eNB/gNB with UDP download/upload traffic until N UEs (e.g. 1,000 UEs) are connected. Then disconnect X UEs per second and connect X new UEs per second to the eNB/gNB with UDP download/upload traffic. Disconnecting and connecting UEs near the maximum number of UEs is continued in order to verify the stability of the eNB/gNB. It is recommended that the total UDP traffic be less than the maximum cell throughput, as maximum cell throughput is out of scope for this test.

In this test the number of UEs is assumed to be large, potentially reaching 1,000. For the values of X and N, the test results from Sections 8.1 and 8.2 may be helpful.

14 8.3.2 Test setup and configuration

15 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs
16 (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G SS-
17 RSRP (for 5G SA) as the metric.

Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.

The performance of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes should be measured while all sub-nodes are under high load. To achieve this, it may be recommended to run an additional test execution step adding additional traffic utilizing the optional OFH and/or F1 interface and test equipment as described in Section 3.1.1, in addition to the traffic applied over the configured RF cell.

24 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
25 SUT and test equipment, parameters) should be recorded in the test report.
26 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
27 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
28 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
29 5G SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be exceeded.
30

31 8.3.3 Test Procedure

32 The test steps below are applicable for either LTE or 5G SA:

33 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
34 report. The serving cell under test is activated and unloaded. All other cells are powered off.
35 2. The UEs (emulated UE) are placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
36 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G SA) in Section 3.6.
37 3. The End-to-end setup shall be operational for LTE or 5G SA as applicable for the test scenario, and there should
38 not be any connectivity issues.


1 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
2 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
5. Connect X UEs per second to the eNB/gNB via signalling, and initiate UDP upload/download for each UE after it gains access (a minimal churn-loop sketch is given after this procedure). In this step and later steps, lost connections shall be re-established automatically to maintain the number of RRC Connected UEs.
6. Continue Step 5 until the total number of UEs reaches the maximum active number of UEs N per cell.
7. Release X UEs per second while continuing to connect X UEs per second to the eNB/gNB, initiating UDP upload/download for each newly accessing UE.
8. Repeat Step 7 for at least 5 minutes and record the KPI values and RRC access success rate.
9. Release all UEs, then connect X UEs per second to the eNB/gNB and initiate UDP upload/download for each newly accessing UE until the total number of UEs reaches the maximum active number N per cell. Continue the test for at least 5 minutes. Record KPI values and packet error rate.
10. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference and measurements.
11. If any of the measured RLC, RRC, and PDCP KPIs are not consistent with expectations for a highly loaded O-DU and O-CU, repeat steps 4 through 10 with similar incremental traffic applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1) in addition to the traffic described in steps 5 through 9 being applied at the RF interface.
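As an illustration of steps 5 through 9, the churn loop below is written against a hypothetical UE-emulator control API (EmuClient with connect_ue, release_ue and start_udp methods); real tester APIs will differ, and X, N, and the per-UE UDP rate are examples only.

# Illustrative only: connect/release churn against a hypothetical emulator API.
import itertools
import time

from ue_emulator import EmuClient   # hypothetical emulator control library

X = 10             # UEs connected (and later released) per second
N = 1000           # maximum active number of UEs per cell
UDP_RATE_MBPS = 1  # per-UE rate, kept well below the maximum cell throughput

emu = EmuClient("emulator.lab.example")
active = []
ue_ids = itertools.count(1)

# Ramp up: X UEs per second until N UEs are active, each with UDP traffic
while len(active) < N:
    for _ in range(X):
        ue = emu.connect_ue(next(ue_ids))
        emu.start_udp(ue, rate_mbps=UDP_RATE_MBPS, direction="both")
        active.append(ue)
    time.sleep(1)

# Churn: release X UEs/s and connect X new UEs/s for at least 5 minutes
end = time.time() + 5 * 60
while time.time() < end:
    for _ in range(X):
        emu.release_ue(active.pop(0))
        ue = emu.connect_ue(next(ue_ids))
        emu.start_udp(ue, rate_mbps=UDP_RATE_MBPS, direction="both")
        active.append(ue)
    time.sleep(1)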

20

21 8.3.4 Test requirements (expected results)

22 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
23 should be captured and reported in the test report for performance assessment.

24 • Radio parameters such as RSRP, RSRQ


25 • N (maximum active number of UEs per cell), X (UEs per second)
26 • KPIs mentioned in Table 8-4 and Table 8-5
27 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

Validate the successful procedures from the collected logs. Check the rate of UE state transitions that can be processed per unit time. For the UE(s) for which call loss has occurred, the validity of the cause of the call loss shall be confirmed.

31 Table 8-4 RRC Access Success Rate

RAT    | RRC Access Success Rate
LTE    |
NR(SA) |

33 Table 8-5 Packet Error Rate

RAT    | Packet Error Rate (Uplink) | Packet Error Rate (Downlink)
LTE    |                            |
NR(SA) |                            |


1 8.4 Traffic Model Testing

2 8.4.1 Test description and applicability

The purpose of this test is to check the performance of the eNB/gNB under varying load by generating various UE behaviors, such as Attach/Detach, Registration/Deregistration, Data upload/download, and Mobility, based on a traffic model, to load the eNB/gNB. Traffic models can vary depending on factors such as the geographic characteristics of the target area, UE type, number of UEs, UE mobility, time of day, and so on. It is important to consider the maximum number of UEs and the behavior that is expected in the field, and to develop an appropriate traffic model. This test is valid for either LTE or 5G NSA/SA. The test procedure is to apply various UE behaviors such as Attach/Detach, Registration/Deregistration, Data upload/download, and Mobility to the eNB/gNB, repeating various scenarios based on the traffic model. By processing complex combinations of scenarios, the processing capacity of the eNB/gNB is confirmed. Review the KPIs for each scenario to see if there are any issues that depend on a particular scenario.

13 8.4.1.1 Traffic model

The traffic model is a model of the set of scenarios that an eNB/gNB is expected to undergo in the actual field. The test items in Chapters 4 and 6 can be part of the traffic model scenarios, either as they are or in simplified form. Each scenario may be repeated continuously at a certain frequency, or many scenarios may be executed simultaneously. The pattern of possible scenarios may differ depending on the characteristics of the area in which the eNB/gNB is deployed, and can be categorized into several typical types. The eNB/gNB capabilities need to be appropriate for the deployment area. See Table 8-6 for examples of typical traffic model types.

20 Table 8-6 Example of typical traffic model types

No. | Traffic Model Type | Major Characteristics
1   | Indoor Hotspot     | - The number of UEs may be very large; not much fluctuation in the number of UEs (less handover).
    |                    | - High traffic volume in data transmission and reception.
2   | Dense Urban        | - The number of UEs is very large.
    |                    | - High traffic volume in data transmission and reception.
    |                    | - High mobility.
    |                    | - Less active at night.
3   | Office area        | - The number of UEs is large.
    |                    | - High traffic volume in data transmission and reception.
    |                    | - High mobility, at slower speed.
    |                    | - Less active at night.
4   | Residential area   | - The number of UEs is small.
    |                    | - Data traffic volume is low during daytime; it may peak at night in areas where Wi-Fi is not widespread.
    |                    | - Low mobility.
    |                    | - Moderately active during daytime.
5   | Rural              | - The number of UEs is very small.
    |                    | - Low data traffic volume.
    |                    | - Low mobility.
6   | High Speed (train station, along the railroad tracks) | - The number of UEs may be temporarily concentrated in large numbers; bursts occur when a train arrives or departs, or when a train passes by.
    |                    | - As a train passes, many UEs pass at high speed at the same time.


The scenarios included in each traffic model type, and their frequency (per UE per hour), should be appropriately determined considering the characteristics of that traffic model type.
In this testing, one or more traffic models can be used to evaluate the capacity, stability, and load durability of the eNB/gNB.
Table 8-7 shows an example of a traffic model. A traffic model defines a large number of UEs and scenarios, and this model can represent traffic observed in the field.

9 Table 8-7 Example of Traffic Model

Total number of UEs in a cell: 1,000 (the numbers are just an example)

No | Scenario (baseline)                     | LTE | 5G NSA | 5G SA | Number of UEs | Attempts (per UE per hour) | Description
1  | Attach                                  | 〇  | 〇     | N/A   | 50  | 0.50 | Power ON
2  | Detach                                  | 〇  | 〇     | N/A   | 50  | 0.50 | Power OFF
3  | Tracking area update                    | 〇  | 〇     | N/A   | 100 | 1.00 |
4  | Registration                            | N/A | N/A    | 〇    | 50  | 0.50 | Power ON
5  | Deregistration                          | N/A | N/A    | 〇    | 50  | 0.50 | Power OFF
6  | Paging                                  | 〇  | 〇     | 〇    | 200 | 1.00 | Total of paging in this model.
7  | Data upload/download, Stationary        | 〇  | 〇     | 〇    | 100 | 1.00 | One RRC_CONNECTED period is defined as "1 attempt". At least the RRC_CONNECTED period and avg. DL/UL bit rate should be specified. If mobility is involved, one handover per attempt is assumed; the mobility type and when to move should be specified.
   | Data upload/download, With Mobility *)  |     |        |       | 5   | 0.50 | (as above)
   | Additional scenario if any              |     |        |       |     |      |
8  | Web Browsing, Stationary                | 〇  | 〇     | 〇    | 120 | 2.00 | At least the hold time, the RRC_CONNECTED period/interval/number of times per call, and avg. DL/UL Mbytes per RRC_CONNECTED period should be specified. If mobility is involved, one handover per attempt is assumed; the mobility type and when to move should be specified.
   | Web Browsing, With Mobility *)          | 〇  | 〇     | 〇    | 6   | 0.50 | (as above)
9  | File upload/download, Stationary        | 〇  | 〇     | 〇    | 30  | 1.50 | At least the file size, upload/download, and hold time should be specified. If mobility is involved, one handover per attempt is assumed; the mobility type and when to move should be specified.
   | File upload/download, With Mobility *)  | 〇  | 〇     | 〇    | 2   | 0.50 | (as above)
10 | Voice Service, Stationary               | 〇  | 〇     | 〇    | 80  | 0.20 | 50% of calls Mobile Terminated and 50% Mobile Originated assumed. At least the avg. DL/UL bit rate and avg. hold time should be specified. If mobility is involved, one handover per attempt is assumed; the mobility type and when to move should be specified.
   | Voice Service, With Mobility *)         | 〇  | 〇     | 〇    | 4   | 0.50 | (as above)
11 | Video Streaming, Stationary             | 〇  | 〇     | 〇    | 40  | 1.00 | At least the avg. DL/UL bit rate and avg. hold time should be specified. If mobility is involved, one handover per attempt is assumed; the mobility type and when to move should be specified.
   | Video Streaming, With Mobility *)       | 〇  | 〇     | 〇    | 2   | 0.50 | (as above)

*) e.g. Intra-cell handover. Other mobility types can be included if multiple cells are configured.
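For illustration, such a traffic model can be captured as data for a test driver, as sketched below; the Scenario structure and the derived aggregate attempt rate are assumptions of the sketch, with values mirroring the example table.

# Illustrative only: encoding (part of) the Table 8-7 example model as data.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    ues: int                    # number of UEs assigned to this scenario
    attempts_per_ue_hour: float

TRAFFIC_MODEL = [
    Scenario("attach", 50, 0.50),
    Scenario("detach", 50, 0.50),
    Scenario("tracking_area_update", 100, 1.00),
    Scenario("paging", 200, 1.00),
    Scenario("data_stationary", 100, 1.00),
    Scenario("web_browsing_stationary", 120, 2.00),
    Scenario("file_transfer_stationary", 30, 1.50),
    Scenario("voice_stationary", 80, 0.20),
    Scenario("video_stationary", 40, 1.00),
]

def attempts_per_second(s: Scenario) -> float:
    """Aggregate attempt rate this scenario contributes to the cell."""
    return s.ues * s.attempts_per_ue_hour / 3600.0

total = sum(attempts_per_second(s) for s in TRAFFIC_MODEL)
print(f"aggregate scenario attempts/s offered to the cell: {total:.3f}")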
The performance of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes should be measured while all sub-nodes are under high load. To achieve this, it may be recommended to run an additional test execution step adding additional traffic utilizing the optional OFH and/or F1 interface and test equipment as described in Section 3.1.1, in addition to the traffic applied over the configured RF cell.

9 8.4.2 Test setup and configuration

10 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs
11 (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G SS-
12 RSRP (for 5G NSA/SA) as the metric. When inter-eNB/gNB mobility is included in the traffic model, the target/source
13 cells and UE behavior/sequence should be simulated by the UE emulator and Core emulator without using additional
14 cells.

Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.

17 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
18 SUT and test equipment, parameters) should be recorded in the test report.
19 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator/fading
20 generator inserted between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using
21 a UE emulator. The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G
22 SS-RSRP (for 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should
23 not be exceeded.

24 8.4.3 Test Procedure

25 The test steps below are applicable for either LTE or 5G NSA and SA:

26 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
27 report. The serving cell under test is activated and unloaded. All other cells are powered off.
28 2. The UEs (emulated UE) are placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
29 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
30 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
31 should not be any connectivity issues.
32 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
33 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
34 5. Start the operation of each UE according to the traffic model.

_______________________________________________________________________________________________.
© 2022 by the O-RAN ALLIANCE e.V. Your use is subject to the copyright statement on the cover page of this specification. 205
O-RAN.TIFG.E2E-Test.0-v04.00

1 6. Check KPIs for each scenario included in the traffic model during continuous operation (several tens of minutes to
2 several hours).
3 7. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
4 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
5 and measurements.
8. If the measured RLC, RRC, and/or PDCP KPIs are not consistent with expectations for a highly loaded O-DU and O-CU, repeat steps 4 through 7 with similar incremental traffic applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1), as was used for Section 8.3.3 (Traffic Load Testing), in addition to the modelled traffic applied at the RF interface.

10 8.4.4 Test requirements (expected results)

11 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
12 should be captured and reported in the test report for performance assessment.

13 • Radio parameters such as RSRP, RSRQ


14 • Traffic model
15 • KPIs mentioned in Table 8-8
16 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

If a scenario in the traffic model is the same as a scenario described in Chapters 4 and 6, it is recommended to select and evaluate some of the corresponding KPIs described in Chapters 4 and 6.

Validate the successful procedures from the collected logs. The test duration should desirably range from several tens of minutes to several hours. Check the completion rate of each scenario; the adequacy of the completion rate for each scenario shall be confirmed. For the UE(s) for which call loss has occurred, the validity of the cause of the call loss shall be confirmed.

24 Table 8-8 KPIs in traffic model scenario (example)

No. | Scenario (baseline)                  | LTE | 5G NSA | 5G SA | Success Rate (%) | Optional KPI if any
1   | Attach                               | 〇  | 〇     | N/A   |                  | e.g. SgNB addition Success rate in NSA
2   | Detach                               | 〇  | 〇     | N/A   |                  |
3   | Tracking area update                 | 〇  | 〇     | N/A   |                  |
4   | Registration                         | N/A | N/A    | 〇    |                  |
5   | Deregistration                       | N/A | N/A    | 〇    |                  |
6   | Paging                               | 〇  | 〇     | 〇    |                  |
7   | Data upload/download, Stationary     | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
    | Data upload/download, With Mobility  |     |        |       |                  | e.g. Avg. UL/DL throughput
    | Additional scenario if any           |     |        |       |                  |
8   | Web Browsing, Stationary             | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
    | Web Browsing, With Mobility          | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
9   | File upload/download, Stationary     | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
    | File upload/download, With Mobility  | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
10  | Voice Service, Stationary            | 〇  | 〇     | 〇    |                  | e.g. Packet loss rate
    | Voice Service, With Mobility         | 〇  | 〇     | 〇    |                  | e.g. Packet loss rate
11  | Video Streaming, Stationary          | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput
    | Video Streaming, With Mobility       | 〇  | 〇     | 〇    |                  | e.g. Avg. UL/DL throughput

3 8.5 Long hours stability Testing

4 8.5.1 Test description and applicability


The purpose of this test is to conduct a load test using the traffic model described in Section 8.4.1.1 for a long period (e.g. 24 to 48 hours) to confirm continuous operation over a long period of time. This test is valid with either LTE or 5G NSA and SA. The test procedure is to apply various UE behaviors such as Attach/Detach, Registration/Deregistration, and Mobility, based on the traffic model, to the eNB/gNB, and to confirm that performance is maintained over a long time. Verify the KPIs for each scenario by repeating call generation and call release for various traffic.

10 8.5.2 Test setup and configuration

The test setup is a single RF cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) as the metric.

Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.

The extended-length performance and stability of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes should be measured while all sub-nodes are under high load. To achieve this, it may be recommended to apply additional traffic utilizing the optional OFH and/or F1 interface and test equipment as described in Section 3.1.1, in addition to the traffic applied at the configured RF cell.

21 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
22 SUT and test equipment, parameters) should be recorded in the test report.
23 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
24 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
25 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
26 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
27 exceeded.

28 8.5.3 Test Procedure


29 The test steps below are applicable for either LTE or 5G NSA and SA:

30 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
31 report. The serving cell under test is activated and unloaded. All other RF cells are powered off.
2. If during execution of Section 8.3.3 (Traffic Load Testing) or Section 8.4.3 (Traffic Model Testing) incremental traffic was applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1) in addition to the traffic applied at the RF interface, the same level of traffic should be applied again using the optional OFH and/or F1 interface and test equipment to ensure the O-DU and O-CU are exposed to an adequate "base load" before execution of Step 6.
4 3. The UE (emulated UE) is placed under excellent radio conditions (Cell centre close to radiated eNB/gNB Antenna)
5 as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
6 4. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
7 should not be any connectivity issues.
8 5. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
9 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools (optional).
10 6. Start the operation of each UE according to the traffic model. Keep the operation for a long period of several tens of
11 hours.
7. Monitor the KPIs for each scenario included in the traffic model during the test (a minimal comparison sketch is given after this procedure). A scenario is considered successful if its KPIs are equivalent to those of the Traffic Model Testing case; if not, it is a failure.
14 8. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
15 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
16 and measurements (optional).
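As an illustration of the KPI monitoring in step 7, the sketch below compares hourly KPI windows against the Section 8.4 (Traffic Model Testing) baseline with a relative tolerance; the KPI names, the 5% tolerance, and the input format are assumptions of the sketch, not normative thresholds.

# Illustrative only: flagging long-run KPI degradation against a baseline.
baseline = {"attach_success_rate": 99.5, "avg_dl_throughput_mbps": 180.0}
TOLERANCE = 0.05   # allow 5% relative deviation from the baseline (assumed)

def degraded(window_kpis: dict) -> list[str]:
    """Return KPI names that fall below (1 - TOLERANCE) of the baseline."""
    return [k for k, ref in baseline.items()
            if window_kpis.get(k, 0.0) < ref * (1 - TOLERANCE)]

# Hourly KPI windows as produced by the test tooling (placeholder data)
hourly_windows = [
    {"attach_success_rate": 99.6, "avg_dl_throughput_mbps": 182.0},
    {"attach_success_rate": 97.1, "avg_dl_throughput_mbps": 150.0},  # degrading
]
for hour, window in enumerate(hourly_windows):
    for kpi in degraded(window):
        print(f"hour {hour}: {kpi} degraded vs. traffic-model baseline")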
17

18 8.5.4 Test requirements (expected results)


19 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
20 should be captured and reported in the test report for performance assessment.

21 • Radio parameters such as RSRP, RSRQ


22 • Traffic Model
23 • KPIs mentioned in Table 8-9, Table 8-8.
24 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

Validate the successful procedures from the collected logs. Normally this test should run for a minimum of 12 hours, with 60 hours as a recommended maximum. Check the completion rate of each scenario; the adequacy of the completion rate for each scenario shall be confirmed. For the UE(s) for which call loss has occurred, the validity of the cause of the call loss shall be confirmed.

29 Table 8-9 Successfully tested hours

RAT     | Successfully tested hours (hours)
LTE     |
NR(NSA) |
NR(SA)  |

31 8.6 Multi-cell Testing

32 8.6.1 Test description and applicability

The purpose of this test is to test the eNB/gNB's processing capacity by applying a load to some or all of its cells simultaneously, based on the traffic model. Inter-cell interference is out of scope, so the DUT cells should be configured such that there is no inter-cell interference. This test is valid with either LTE or 5G NSA and SA. The test procedure is to apply various UE behaviors such as Attach/Detach, Registration/Deregistration, and Mobility, based on the traffic model, to some or all of the cells in the eNB/gNB. Verify the KPIs for each scenario in each cell by repeating call generation and call release for various traffic (scenarios). This test is a multi-cell version of Section 8.4 and is intended to increase the load on the eNB/gNB. In this testing, various mobility scenarios among the DUT cells can be included.
The performance of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes should be measured while all sub-nodes are under high load. To achieve this, it may be recommended to run an additional test execution step adding additional traffic utilizing the optional OFH and/or F1 interface and test equipment as described in Section 3.1.1, in addition to the traffic applied over the two or more configured RF cells.

10 8.6.2 Test setup and configuration

The test setup is a multi-cell scenario (i.e. isolated cells without any inter-cell interference – see Section 3.7) with UEs (emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) as the metric.

Since this test scenario requires a large number of UE connections, it is recommended to perform the test using equipment that emulates a large number of UEs.

16 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
17 SUT and test equipment, parameters) should be recorded in the test report.
18 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
19 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
20 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
21 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
22 exceeded.

23 8.6.3 Test Procedure

24 The test steps below are applicable for either LTE or 5G NSA and SA:

25 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
26 report. The target cells under test are activated and unloaded. All other cells are powered off.
27 2. The UE (emulated UE) is placed under excellent radio conditions (Cell centre close to radiated eNB/gNB Antenna)
28 as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
29 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
30 should not be any connectivity issues.
31 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
32 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
5. Start the operation of each UE according to the traffic model for all of the target cells.
6. Check the KPIs for each scenario included in the traffic model. A scenario is considered successful if its KPIs are equivalent to those of the single-cell case; if not, it is a failure.
36 7. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
37 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
38 and measurements.
8. If the measured RLC, RRC, and/or PDCP KPIs are not consistent with expectations for a highly loaded O-DU and O-CU, repeat steps 4 through 7 with similar incremental traffic applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1), as was used for Section 8.3.3 (Traffic Load Testing) or Section 8.4.3 (Traffic Model Testing), in addition to the modelled traffic applied at the RF interface.


1 8.6.4 Test requirements (expected results)

2 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
3 should be captured and reported in the test report for performance assessment.

4 • Radio parameters such as RSRP, RSRQ


5 • Traffic Model
6 • KPIs mentioned in Table 8-10, Table 8-8
7 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

Validate the successful procedures from the collected logs. The number of cells in communication should preferably be 2 or more. Check the KPIs of each scenario; the adequacy of the completion rate for each scenario shall be confirmed. For the UE(s) for which call loss has occurred, the validity of the cause of the call loss shall be confirmed.

11 Table 8-10 Successfully tested number of Cells

RAT     | Successfully tested number of Cells
LTE     |
NR(NSA) |
NR(SA)  |

13 8.7 Emergency call

14 8.7.1 Test description and applicability

The purpose of this test is to ensure that emergency calls, such as 911/112, are preferentially established even if the eNB/gNB is overloaded. This test is valid with either LTE or 5G NSA and SA. The test procedure is to further increase the heavy traffic model loading the eNB/gNB, creating an overload condition in which some UE behaviors, such as data upload/download, fail. Verify that emergency calls, such as 911/112, can be established and communicate correctly in such a situation.
21 The Emergency Call performance of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes
22 should be measured while all sub-nodes are under high load. If additional traffic utilizing the optional OFH and/or F1
23 interface and test equipment as described in Section 3.1.1 was applied in Section 8.3.3 (Traffic Load Testing) or Section
24 8.4.3 (Traffic Model Testing) in addition to the traffic applied at the configured RF cell, the same additional traffic
25 should be applied for the Emergency Call test case.

26

27 8.7.2 Test setup and configuration

28 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs
29 (real or emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G
30 SS-RSRP (for 5G NSA/SA) as the metric.

Since this test scenario requires a large amount of non-emergency traffic to overload the SUT, it is recommended to run the test with equipment that emulates a large amount of non-emergency traffic.

33 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
34 SUT and test equipment, parameters) should be recorded in the test report.


1 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
2 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
3 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
4 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
5 exceeded.

6 8.7.3 Test Procedure


7 The test steps below are applicable for either LTE or 5G NSA and SA:

8 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
9 report. The serving cell under test is activated and unloaded. All other cells are powered off.
10 2. The UE (real or emulated UE) is placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
11 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
12 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
13 should not be any connectivity issues.
14 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
15 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
16 5. If during execution of Section 8.3.3 (Traffic Load Testing) or 8.4.3 (Traffic Model Testing) incremental traffic
17 was applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1) in addition to the
18 traffic being applied at the RF interface, the same level of traffic should be applied again using the optional OFH
19 and/or F1 interface and test equipment to ensure the O-DU and O-CU are exposed to adequate “base load” before
20 executing Step 6.
21 6. Start the operation of each UE according to the traffic model.
22 7. Review the KPIs for each scenario in the traffic model to ensure that the KPIs are degraded due to overload. If there
23 is no degradation of the KPI, increase the traffic model until there is degradation of the KPI (until it becomes
24 overloaded).
25 8. Make multiple emergency calls (100 attempts suggested), such as 911/112, to confirm that the calls can be established.
26 9. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
27 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
28 and measurements.

29 8.7.4 Test requirements (expected results)

30 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
31 should be captured and reported in the test report for performance assessment.

32 • Radio parameters such as RSRP, RSRQ


33 • Traffic Model and related KPIs
34 • KPIs mentioned in Table 8-11
35 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

36 Validate the successful procedures from the collected logs. Confirm that the emergency calls are established and communicate correctly.

37 Table 8-11 Emergency Call Success Rate

Technology      Emergency Call Success Rate
LTE
NR(NSA)
NR(SA)
38
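For illustration only (informative): the Emergency Call Success Rate entries of Table 8-11 can be derived from per-attempt call records collected in Step 8. The sketch below assumes a simple log format (one record per attempt with a RAT label and an established flag); that format is an assumption of the sketch, not something defined by this specification.

```python
# Minimal sketch (non-normative): derive the Emergency Call Success Rate of
# Table 8-11 from per-attempt call records. The record format is an assumption
# of this sketch, not something defined by this specification.
from collections import defaultdict

def emergency_call_success_rate(attempts):
    """attempts: iterable of dicts such as
    {"rat": "LTE" | "NR(NSA)" | "NR(SA)", "established": bool}."""
    totals, successes = defaultdict(int), defaultdict(int)
    for a in attempts:
        totals[a["rat"]] += 1
        successes[a["rat"]] += int(a["established"])
    return {rat: 100.0 * successes[rat] / totals[rat] for rat in totals}

# Example with the suggested 100 attempts on NR(SA):
log = [{"rat": "NR(SA)", "established": i != 7} for i in range(100)]
print(emergency_call_success_rate(log))  # {'NR(SA)': 99.0}
```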


1 8.8 ETWS (Earthquake and Tsunami Warning System)

2 8.8.1 Test description and applicability

3 The purpose of this test is to verify that the ETWS notification is successful even if the eNB/gNB is overloaded. This
4 test is valid with either LTE or 5G NSA and SA. The test procedure is to further increase the heavy traffic model to load
5 the eNB/gNB, creating an overload condition where some behavior of the UE, such as data upload/download, fails.
6 Verify that the ETWS Primary and Secondary notifications are delivered correctly under these conditions.
7 The ETWS performance of the SUT 3GPP stack components in all of the O-RU, O-DU, and O-CU sub-nodes should be
8 measured while all sub-nodes are under high load. If additional traffic utilizing the optional OFH and/or F1 interface
9 and test equipment as described in Section 3.1.1 was applied in Section 8.3.3 (Traffic Load Testing) or 8.4.3 (Traffic
10 Model Testing) in addition to the traffic applied at the configured RF cell, the same additional traffic should be applied
11 for the ETWS test case.

12

13 8.8.2 Test setup and configuration

14 The test setup is a single cell scenario (i.e. isolated cell without any inter-cell interference – see Section 3.7) with UEs
15 (real or emulated) placed under excellent radio conditions as defined in Section 3.6, using LTE RSRP (for LTE) or 5G
16 SS-RSRP (for 5G NSA/SA) as the metric.

17 Since this test scenario requires a large amount of non-emergency traffic to overload the SUT, it is recommended to run
18 the test with equipment that emulates a large amount of non-emergency traffic.

19 Test configuration: The test configuration is not specified. The utilized test configuration (connection diagram between
20 SUT and test equipment, parameters) should be recorded in the test report.
21 Laboratory setup: The radio conditions experienced by the UE can be modified using a variable attenuator inserted
22 between the antenna connectors (if available) of the O-RU and the UE, or appropriately emulated using a UE emulator.
23 The test environment should be setup to achieve excellent radio conditions (LTE RSRP (for LTE) or 5G SS-RSRP (for
24 5G NSA/SA) as defined in Section 3.6) for the UE, but the minimum coupling loss (see Section 3.6) should not be
25 exceeded.

26 8.8.3 Test Procedure

27 The test steps below are applicable for either LTE or 5G NSA and SA:

28 1. The test setup is configured according to the test configuration. The test configuration should be recorded in the test
29 report. The serving cell under test is activated and unloaded. All other cells are powered off.
30 2. The UE (real or emulated UE) is placed under excellent radio conditions (Cell centre close to radiated eNB/gNB
31 Antenna) as defined by LTE RSRP (for LTE) or 5G SS-RSRP (for 5G NSA/SA) in Section 3.6.
32 3. The End-to-end setup shall be operational for LTE or 5G NSA/SA as applicable for the test scenario, and there
33 should not be any connectivity issues.
34 4. Required performance data (incl. the signalling and control data) as specified in the “Test requirements” Section
35 below is measured and captured at the UE(s) and eNB/gNB side using logging/measurement tools.
36 5. If during execution of Section 8.3.3 (Traffic Load Testing) or 8.4.3 (Traffic Model Testing) incremental traffic
37 was applied using the optional OFH and/or F1 interface and test equipment (see Section 3.1.1) in addition to the
38 traffic being applied at the RF interface, the same level of traffic should be applied again using the optional OFH
39 and/or F1 interface and test equipment to ensure the O-DU and O-CU are exposed to adequate “base load” before
40 executing Step 6.


1 6. Start the operation of each UE according to the traffic model.


2 7. Review the KPIs for each scenario in the traffic model to ensure that the KPIs are degraded due to overload. If there
3 is no degradation of the KPI, increase the traffic model until there is degradation of the KPI (until it becomes
4 overloaded).
5 8. Confirm the delivery of the ETWS Primary and Secondary notifications multiple times (100 times suggested).
6 9. Stop and save the test logs. Check the log to make sure that the test runs successfully and that no unexpected
7 behavior such as unexpected call release is recorded. The logs should be captured and kept for test result reference
8 and measurements.

9 8.8.4 Test requirements (expected results)

10 In addition to the common minimum set of configuration parameters (see Section 3.3), the following metrics and counters
11 should be captured and reported in the test report for performance assessment.

12 • Radio parameters such as RSRP, RSRQ


13 • Traffic Model and related KPIs
14 • KPIs mentioned in Table 8-12
15 • SUT load/capacity/performance related KPIs (e.g. CPU and MEMORY utilization) if any

16 Validate the successful procedures from the collected logs. Make sure that the ETWS function works correctly and
17 confirm that the ETWS notifications are sent to the UE.

18 Table 8-12 ETWS Notification Success Rate

                ETWS Notification Success Rate
Technology      Primary         Secondary
LTE
NR
19
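For illustration only (informative): the entries of Table 8-12 can be derived by counting Primary and Secondary notification deliveries separately per RAT from the UE-side logs of Step 8. The per-trial record format is an assumption of this sketch.

```python
# Minimal sketch (non-normative): fill Table 8-12 from UE-side logs, counting
# Primary and Secondary notification deliveries separately per RAT. The log
# record format is an assumption of this sketch.
def etws_success_rate(trials):
    """trials: list of dicts such as
    {"rat": "LTE" | "NR", "type": "primary" | "secondary", "received": bool}."""
    table = {}
    for t in trials:
        total, ok = table.get((t["rat"], t["type"]), (0, 0))
        table[(t["rat"], t["type"])] = (total + 1, ok + int(t["received"]))
    return {key: 100.0 * ok / total for key, (total, ok) in table.items()}

trials = ([{"rat": "NR", "type": "primary", "received": True}] * 100
          + [{"rat": "NR", "type": "secondary", "received": False}] * 2
          + [{"rat": "NR", "type": "secondary", "received": True}] * 98)
print(etws_success_rate(trials))
# {('NR', 'primary'): 100.0, ('NR', 'secondary'): 98.0}
```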


1 RIC-Enabled End-to-end Use Case Test


2
3 This chapter describes tests for evaluating and assessing the performance of a radio access network with Non-RT RIC
4 and Near-RT RIC optimisations from a network end-to-end perspective. The focus of the testing is either on the end-
5 user performance with user specific optimisations by the Non-RT RIC and Near-RT RIC or on system level performance
6 with system specific optimisations.

7 The test setup and test methodology applicable to all the RIC-enabled end-to-end use case tests is described in Section
8 9.1, followed by the various test cases including their test description and procedures for each use case. At the
9 beginning of each use case section, system level, cell level and UE level test configurations and KPIs involved in
10 defining the test scenario and measuring the performance related to the use case testing are discussed. Following this,
11 the exact parameter values and KPIs to be measured are defined on a per test case basis.

12 The test applies to SA (Standalone) mode operation only in this version of the test specification.

13 The following RIC-enabled use case tests are defined in this chapter:

14 • Traffic steering with connected mode mobility control

15 Future versions of this specification may add additional tests, for example:

16 • Traffic steering with idle mode mobility control


17 • QoS/QoE optimisation
18 • RAN slice SLA assurance
19 • Massive MIMO Beamforming Optimisation

20 Please see below summary table of test cases and applicable technology.

21 Table 9-1: RIC-Enabled End-to-end Use Case Tests summary

TEST ID: 9.2
Test case: Traffic steering with connected mode mobility control
Functional group: RIC-Enabled End-to-end Use Case - Traffic Steering Use Case
Applicable technology: LTE: N/A; NSA: N/A; SA: Y
22
23

24 9.1 Test Setup and Methodology


25 The overall test setup is shown in Figure 9-1. The SUT in the RIC-enabled end-to-end use case test comprises the
26 O-DU(s), O-CU-CP/O-CU-UP, and the Non-RT RIC and Near-RT RIC functions. The O-RU is not recommended to
27 be part of the SUT, since it does not contribute materially to the outcome of these tests, and several O-RUs would be
28 needed to sufficiently extract the performance benefit of the RIC for each of the use cases. The test needs to evaluate
29 UE and system performance for mixed types of service, e.g., voice service, video service and data service etc. A 5G
30 core will be required to support the basic functionality to authenticate and register an end user device in order to setup a
31 PDN connection/PDU session. An IMS core will be required to register the end user device to support voice or video
32 services on a 5G network. The 5G and IMS cores may be completely emulated, partially emulated or real non-
33 emulated cores. Support at the application server(s) for different types of data services is defined in Section 6.1.


1 The O-RUs and end user devices (UEs) should be emulated in these tests, since the system performance of the various
2 use cases needs to be evaluated with a sufficient number of cells and UEs, and under different radio conditions, load
3 conditions and mobility patterns (e.g., at least 10 cells and 100 UEs per cell located at different positions in the network
4 and moving along different trajectories at different speeds) to generate statistics for performance analysis. In addition,
5 with emulated O-RUs and UEs, traffic can be applied directly at the Open Fronthaul interface which is more cost and
6 form-factor efficient due to the large number of cells and UEs that are required.

7 All the elements in the network, including the O-RAN system, 5G core and the application servers need to have the
8 ability to capture traces to validate the successful execution of the test cases. The end user device needs to have the
9 capability to capture traces/packets to calculate the KPIs related to the RIC enabled use cases.

[Figure: the SUT comprises the SMO/Non-Real-Time RIC, Near-Real-Time RIC, gNB-CU-CP, gNB-CU-UP and gNB-DU, connected over the O1, A1, E2, F1 and Open Fronthaul interfaces to test equipment (actual or simulated): O-RU, UE, 5G Core, IMS Core and Application Server]
10

11 Figure 9-1 Illustration of test setup for RIC enabled use case E2E testing

12 The purpose of these tests is to validate the O-RAN system’s E2E performance for all the various RIC-enabled use
13 cases in different network deployment scenarios [36-38]. In order to achieve this, multiple sets of test configuration
14 parameters are defined for system level, cell level and UE level configuration. In keeping with the spirit of the end-to-
15 end test focus on the O-RAN system as SUT, the detailed O-RAN protocol conformance on the various O-RAN internal
16 interfaces as specified in [36-38] will not be verified; instead, system-level performance according to the objectives
17 of the use case is the primary test focus.

18 There are no passing criteria defined for these performance-focused tests. This is due to the diversity of vendor-
19 specific RRM and MAC layer scheduler implementations in the O-RAN system which also depend on specific network
20 deployment scenarios, making it difficult to specify minimum performance requirements.

21 The performance of the system with Non-RT RIC and Near-RT RIC optimisations should be benchmarked through analysis
22 of KPI measurement results. The performance gain with Non-RT RIC and Near-RT RIC optimisation is evaluated by
23 comparing the test result with RIC optimisation enabled versus that with it disabled.
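For illustration only (informative): one way to quantify this comparison is a relative change per KPI between the two runs, sign-adjusted so that a positive value always denotes an improvement. The KPI names below are illustrative placeholders, not names defined by this specification.

```python
# Minimal sketch (non-normative): relative KPI change of a RIC-enabled run over
# a baseline run with RIC optimisation disabled. KPI names are illustrative;
# a positive gain always denotes an improvement.
def ric_gain_pct(baseline, optimised, higher_is_better=("dl_cell_tput_mbps",)):
    gains = {}
    for kpi, base in baseline.items():
        delta = 100.0 * (optimised[kpi] - base) / base
        # For delay-like KPIs a reduction is the improvement, so flip the sign.
        gains[kpi] = delta if kpi in higher_is_better else -delta
    return gains

baseline  = {"dl_cell_tput_mbps": 412.0, "dl_ran_ue_delay_ms": 18.4}
optimised = {"dl_cell_tput_mbps": 455.0, "dl_ran_ue_delay_ms": 15.1}
print(ric_gain_pct(baseline, optimised))
# {'dl_cell_tput_mbps': 10.43..., 'dl_ran_ue_delay_ms': 17.93...}
```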

24 9.2 Traffic Steering Use Case Test


25 The purpose of this test is to validate the O-RAN system’s E2E functionalities and evaluate system performance for the
26 Traffic Steering (TS) use case. The O-RAN traffic steering use case specification can be found in Section 3.5 of [36],
27 with the use case requirements from a non-RT RIC standpoint specified in Section 3.1 of [37] and near-RT RIC
28 standpoint in Section 3.1 of [38].

29 The traffic steering optimization can be achieved through connected mode mobility control or idle mode mobility
30 control through the Near-RT RIC [38], Section 3.1. In this version of the test specification, only the connected mode
31 mobility control is addressed while idle mode mobility control is FFS.


1 9.2.1 Test Configuration

2 9.2.1.1 System Level Configuration

3 The following system configuration parameters should be configured and recorded in the test report:

4 • Node configuration: mapping of cells to O-RUs, mapping of O-RUs to O-DUs and O-DUs to O-CU-CP/UP,
5 and configuration of basic RAN parameters over O1.
6 • Traffic Steering mode: TS mode 0, TS mode 1 or TS mode 2 [38]
7 • KPI optimisation policies: e.g., bit rate guarantee, QoS guarantee, maximum cell throughput and maximum
8 system throughput etc.

9 9.2.1.2 Cell Level Configuration

10

[Figure: example multi-band cell topologies, Bands 1-4]
11
12 Figure 9-2: Illustration of cell topologies

13 In SA, the test setup consists of multiple 5G cells, each one associated with a certain O-DU and O-CU connected to a
14 5G core network. In order to evaluate system performance, the example parameters listed in Table 9-2 can be configured
15 for different cell deployment scenarios. The network may have macro cells as a coverage layer and small cells at
16 different locations for capacity boosting. Each cell site can provide service access on multiple frequency bands.

17 Table 9-2 Example cell topology configurations

Item No. Categories Parameters

1 Macro Cell deployment pattern Inter-site distance


Cell radius
Cell location
Frequency bands
2 Small Cell deployment pattern Inter-site distance
Cell location
Cell radius
Frequency bands
3 Inter cell interference Section 3.7
5 Deployment Environment Urban Macro
Urban Micro
Rural Macro
Indoor


6 Antenna Tx power, antenna height, tilt, azimuth, radiation pattern and number of antenna elements
7 Network configurations Carrier frequency
Duplexing mode
Bandwidth
Numerology
1

2 9.2.1.3 UE Level Configuration

3 The test environment shall have multiple UEs, with active traffic, moving across multiple cells. Locations where the radio
4 conditions are excellent, good, fair and poor need to be configured for each cell. UEs should be placed into these different
5 radio conditions.

6 The UEs can be distributed in the geographical distribution scenarios as defined below (see Figure 9-3):

7 a) uniform distribution: UEs are placed with a spatial separation randomly selected following a uniform
8 probability distribution.

9 b) non-uniform distribution: the UEs are placed with a spatial separation randomly selected following a certain
10 distribution such that the UEs tend to be grouped in clusters. The UE density in a small cell hotspot area may be
11 higher than the UE density in the wider macro cell coverage area. The UEs in a cluster experience similar radio
12 conditions.

13 As illustrated in Figure 9-4, the UE mobility trajectory can be a predefined deterministic path or may be based on a random
14 walk model following one of the geographical distribution profiles (uniform or non-uniform distribution) defined above.
15 UEs of different speed categories can be configured in the test with the speed profile as defined in Section 3.5. The UE
16 mobility speed may increase and decrease from time to time around the typical value specified for each speed category,
17 within an interval defined by the minimum and maximum speed configuration. The location distribution, mobility
18 trajectory and mobility class configurations should be recorded in the test report along with the KPI measurement results
19 defined in Section 9.2.1.4 for the group of UEs of interest that is monitored for performance evaluation.
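For illustration only (informative): the sketch below shows one way a UE emulator could realize these profiles, with uniform and clustered placement plus a bounded-speed random walk. All function names and numeric values (area size, cluster spread, the 8.3 m/s typical speed) are assumptions of the sketch, not requirements of this specification.

```python
# Minimal sketch (non-normative): uniform or clustered UE placement plus a
# bounded-speed random walk. All numeric values are illustrative.
import math
import random

def place_uniform(n_ues, x_max, y_max):
    return [(random.uniform(0, x_max), random.uniform(0, y_max)) for _ in range(n_ues)]

def place_clustered(n_ues, centres, sigma_m):
    # UEs scatter around hotspot centres, emulating the non-uniform profile.
    return [(cx + random.gauss(0, sigma_m), cy + random.gauss(0, sigma_m))
            for cx, cy in random.choices(centres, k=n_ues)]

def random_walk(start, v_typ, v_min, v_max, steps, dt_s=1.0):
    """Yield positions; speed drifts around v_typ (m/s), clamped to [v_min, v_max]."""
    (x, y), v = start, v_typ
    for _ in range(steps):
        v = min(max(v + random.gauss(0, 0.1 * v_typ), v_min), v_max)
        heading = random.uniform(0, 2 * math.pi)
        x, y = x + v * dt_s * math.cos(heading), y + v * dt_s * math.sin(heading)
        yield (x, y)

ues = place_uniform(100, 5000.0, 5000.0)          # e.g., 100 UEs over a 5 km square
hotspot_ues = place_clustered(30, [(1000.0, 1000.0)], sigma_m=50.0)
trajectory = list(random_walk(ues[0], v_typ=8.3, v_min=0.5, v_max=14.0, steps=600))
```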


26 Figure 9-3: Illustration of UE distribution pattern (a) uniform (b) non-uniform


32 Figure 9-4: Illustration of UE trajectory pattern (a) deterministic route (b) random walk


1 The following service-related parameters may be configured on a per UE per QoS flow basis as illustrated by the
2 examples shown in Table 9-3 for the group of UEs of interest that is monitored for performance evaluation. Another
3 group of UEs can be configured to apply background traffic load to each cell to create low, medium, and high load
4 conditions for different service types as required. The traffic load condition for both groups can vary with time around
5 the typical value within an interval defined by the minimum and maximum load variation configuration. The service-
6 related configurations including cell load conditions should be recorded in the test report.
7 • User traffic type
8 • Traffic target load
9 • 5QI
10 • S-NSSAI
11 Table 9-3: Example of UE service configuration for three UEs [37]

UE Id=1, S-NSSAI=1:   5QI=1: Voice; 5QI=8: FTP, Email; 5QI=83: Advanced Driving
UE Id=2, S-NSSAI=2:   5QI=1: Voice; 5QI=8: Email
UE Id=3, S-NSSAI=3:   5QI=1: Voice; 5QI=8: Progressive Video; 5QI=8: File sharing

12 It is advised to repeat the test using various mobility and traffic conditions to sufficiently evaluate the E2E system
13 performance with Non-RT RIC and Near-RT RIC optimization based on AI/ML technology, compensating for
14 performance uncertainties of AI/ML models.

15 9.2.1.4 Expected Result and KPIs

16 The values of the system level KPIs and end user KPIs for each test run both with and without Traffic-Steering enabled
17 should be included in the test report along with the test configuration parameters as defined in Section 9.2.1 to provide a
18 comprehensive view of the test setup. The test results are recorded at a constant time interval that is short enough to
19 identify features of interest in the test analysis; the value of the time interval should be set appropriately for the
20 UE mobility class configured.
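For illustration only (informative): the sketch below resamples a captured KPI trace onto a constant reporting interval and summarises its distribution/average, here using pandas. The 1 s interval and KPI name are illustrative; the interval should be chosen to suit the configured mobility class.

```python
# Minimal sketch (non-normative): resample a captured KPI trace onto a constant
# reporting interval and summarise its distribution/average. The 1 s interval
# and the KPI name are illustrative assumptions of this sketch.
import pandas as pd

samples = pd.DataFrame(
    {"dl_prb_usage_pct": [31.0, 44.5, 52.0, 47.5]},
    index=pd.to_datetime(["2022-07-01 10:00:00.2", "2022-07-01 10:00:00.9",
                          "2022-07-01 10:00:01.6", "2022-07-01 10:00:02.4"]),
)
per_interval = samples.resample("1s").mean()        # constant time interval
print(per_interval)
print(per_interval["dl_prb_usage_pct"].describe())  # distribution/average
```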

21 There are no passing criteria defined for this test due to the diversity of vendor-specific RRM and MAC layer scheduler
22 implementations in the O-RAN system. The performance of the system with Non-RT RIC and Near-RT RIC optimisations
23 should be benchmarked through analysis of KPI measurement results. The performance gain with Non-RT RIC and
24 Near-RT RIC optimisation is evaluated by comparing the test result with RIC optimisation enabled versus that with it
25 disabled.

26 The following statistical KPI measurements should be included in the test report with time stamps:

27 Cell level KPIs (for cells in performance monitoring group):

28 • Distribution/average of DL total PRB usage per cell


29 • Distribution/average of UL total PRB usage per cell
30 • Distribution/average of DL throughput in gNB per cell per QoS class
31 • Distribution/average of UL throughput in gNB per cell per QoS class
32 • Distribution/average of DL delay between NG-RAN and UE per cell per QoS class
33 • Distribution/average of UL delay between NG-RAN and UE per cell per QoS class
34 • Distribution/average of delay DL air-interface per cell per QoS class
35
36 End user KPIs (for UEs in performance monitoring group):

37 • Distribution/average of DL Total PRB Usage per UE


38 • Distribution/average of UL Total PRB Usage per UE
39 • Distribution/average of DL UE throughput in gNB per UE per QoS class


1 • Distribution/average of UL UE throughput in gNB per UE per QoS class


2 • Distribution/average of DL delay between NG-RAN and UE per UE per QoS class
3 • Distribution/average of UL delay between NG-RAN and UE per UE per QoS class
4 • Distribution/average of delay DL air-interface per UE per QoS class
5 • Distribution/average of RSRP, RSRQ, PDSCH SINR per UE
6 • Wideband CQI and MCS distribution in PDSCH per UE
7 • Wideband CQI and MCS distribution in PUSCH per UE
8
9 Optionally, the following instantaneous KPI measurements can be attached to the report (as appropriate) along with
10 timestamps and UE geolocation information to provide further details in the test analysis.
11
12 Cell level KPIs (for cells in performance monitoring group):

13 • DL total PRB usage per cell


14 • UL total PRB usage per cell
15 • DL throughput in gNB per cell per QoS class
16 • UL throughput in gNB per cell per QoS class
17 • DL delay between NG-RAN and UE per cell per QoS class
18 • UL delay between NG-RAN and UE per cell per QoS class
19 • Delay DL air-interface per cell per QoS class
20
21 End user KPIs (for UEs in performance monitoring group):

22 • DL Total PRB Usage per UE


23 • UL Total PRB Usage per UE
24 • DL UE throughput in gNB per UE per QoS class (IP)
25 • UL UE throughput in gNB per UE per QoS class (IP)
26 • DL delay between NG-RAN and UE per UE per QoS class
27 • UL delay between NG-RAN and UE per UE per QoS class
28 • Delay DL air-interface per UE per QoS class
29 • RSRP, RSRQ, PDSCH SINR per UE
30
31 The corresponding definitions of the KPIs above can be found in 3GPP TS 28.552 [35], Section 5.1.
32
33 KPIs related to service Quality-of-Experience (QoE):
34 • Call drop rate
35 • Relative Jitter for voice traffic
36 • DL RTP Error Rate for voice traffic
37 • UL RTP Error Rate for voice traffic
38 • MOS value for voice traffic
39 • MOS value for video traffic
40 • FTP download/upload throughput
41 • Average DL/UL IP throughput
42 • Average DL/UL IP latency
43

44 9.2.2 RIC-TS1 - Traffic Steering (Connected Mode Mobility) Test Case

45 9.2.2.1 Test Description and Applicability

46 The purpose of this test is to validate the O-RAN system’s E2E performance for the Traffic Steering use case in a
47 minimum viable network deployment scenario with two macro cells operating at low-band and mid-band respectively.


1 In the test, 3 groups of UEs with different mobility classes (high, low and stationary) and running different services are
2 configured. UEs in the first group run Voice (5QI=1) service and UEs in the second and third UE groups run MBB
3 (5QI=9) service. All UEs stay in RRC connected state. The Non-RT and Near RT RIC are informed of the requirements
4 and characteristics of the services as well as the mobility class of each UE. The RIC entities aim to steer the traffic to
5 the most appropriate cells through Connected Mode mobility control to guarantee the QoS requirements for the service
6 of each UE, maximize service continuity, and meanwhile maximize the overall system throughput for MBB service.
7
8 The test applies to SA (Standalone) mode operation only in this version of the test specification.

9 9.2.2.2 System Level Configuration

10 The following system configuration parameters should be configured and recorded in the test report:

11 Node Configuration (O-RU, O-DU and O-CU-CP/UP)


12 The node configuration of the O-RAN system (O-RU, O-DU and O-CU-CP/O-CU-UP) is not specified. The utilized
13 test configuration (parameters) should be recorded in the test report.
14
15 Traffic Steering Mode
16 Optimization in Near-RT RIC can be done with A1 policy guidance from Non-RT RIC (TS Mode 2) or without the
17 guidance (TS Mode 1) [38]. The performance gain with TS Mode 1 or 2 optimization can be compared against TS Mode
18 0 where Non-RT RIC and Near-RT RIC assisted optimisation is disabled. The exact TS Mode configured for the test is
19 not specified. The utilized configuration (parameters) should be recorded in the test report.
20
21 KPI Optimization Policy
22 Guarantee QoS requirement for the service of each UE as defined in [41], Table 5.7.4-1, maximize service continuity,
23 meanwhile maximize the system throughput and service experience for MBB (5QI=9) service.

24 9.2.2.3 Cell Level Configuration

[Figure: overlapping coverage areas of Cell A (mid-band) and Cell B (low-band)]
25
26 Figure 9-5: Illustration of cell topologies

27 As illustrated in Figure 9-5, the UEs are in an area covered by two cells on different frequency bands identified as Cell A
28 and Cell B respectively. Cell B is the macro cell operating at low band with the best coverage. Cell A is a macro cell


1 operating at mid-band providing higher capacity than Cell B but less coverage. The configuration parameters for each
2 cell are shown in Table 9-4.
3 Table 9-4 Cell configuration profile

Item No. Categories Parameters Value

1 Macro Cell deployment pattern Inter-site distance NA


Cell radius    Cell A: typical urban macro cell range lower than Cell B, e.g., 2.5 km
               Cell B: typical urban macro cell range higher than Cell A, e.g., 5 km
Cell position As shown in Figure 9-5
Number of sectors per cell 3
Sector start/stop angle (degree)    Sector 1: e.g., 0-119
                                    Sector 2: e.g., 120-239
                                    Sector 3: e.g., 240-359
Frequency bands Cell A: e.g., mid band, n77
Cell B: e.g., low band, n8
2 Small Cell deployment pattern Inter-site distance NA

Cell radius NA
Cell position NA
Number of sectors per cell NA
Sector start/stop angle (degree) NA
Frequency bands NA
3 Inter cell interference    Section 3.7    Apply inter cell interference according to Section 3.7
5 Deployment Environment Urban Macro Cell A, Cell B
Urban Micro NA
Rural Macro NA
Indoor NA
6 Antenna    Tx power, antenna height, tilt, azimuth, radiation pattern and number of antenna elements:
             typical configurations meeting the cell sector and cell radius requirements defined above.
             MIMO antenna configurations: Cell A: e.g., 4TX, 4RX; Cell B: e.g., 2TX, 2RX
7 Network configurations    Carrier frequency: Not specified
                            Bandwidth: Cell A: e.g., 100 MHz; Cell B: e.g., 20 MHz
                            Numerology: Cell A: e.g., 1; Cell B: e.g., 1
4
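For illustration only (informative): the example two-cell profile above can be captured as a machine-readable structure and handed to the test equipment. All key names and values below are assumptions of this sketch, mirroring the "e.g." entries of Table 9-4.

```python
# Minimal sketch (non-normative): the example two-cell profile of Table 9-4
# expressed as a structure that test equipment could consume. All key names
# and values are illustrative assumptions of this sketch.
CELLS = {
    "A": {"environment": "urban-macro", "band": "n77", "radius_km": 2.5,
          "bandwidth_mhz": 100, "numerology": 1, "mimo": {"tx": 4, "rx": 4},
          "sectors_deg": [(0, 119), (120, 239), (240, 359)]},
    "B": {"environment": "urban-macro", "band": "n8", "radius_km": 5.0,
          "bandwidth_mhz": 20, "numerology": 1, "mimo": {"tx": 2, "rx": 2},
          "sectors_deg": [(0, 119), (120, 239), (240, 359)]},
}

for name, cell in CELLS.items():
    # e.g., hand each profile to the O-RU/UE emulator configuration step
    print(f"Cell {name}: {cell['band']}, {cell['bandwidth_mhz']} MHz, "
          f"numerology {cell['numerology']}, {cell['mimo']['tx']}TX/{cell['mimo']['rx']}RX")
```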

5 9.2.2.4 UE Level Configuration

6 In this test case there are 3 groups of UEs defined. Groups 1 and 2 contain 10 UEs each that move within the area
7 covered by the two frequency bands in a random walk pattern following a uniform geographical distribution as
8 discussed in Section 9.2.1.3. The mobility class for each group of UEs is defined in Table 9-5. Service-related
9 parameters are configured on a per UE per QoS flow basis for the Group 1 and 2 UEs as defined in Table 9-5. Group 3
10 contains 10 stationary UEs located at the positions shown in Figure 9-6, applying background traffic load to each cell.
11 The number of stationary UEs placed in each cell and the MBB traffic throughput of each UE is configured to create the


1 low, medium, and high load conditions defined in Table 9-6. The stationary UEs remain on their defined cells without
2 handover to other cells, to sustain the defined load conditions.
3

[Figure: positions of the stationary UEs within the coverage of Cell A and Cell B]
5 Figure 9-6: Illustration of stationary UE positions

7 Table 9-5 UE configuration profile

UE Identifier      Group 1: UE Id=1-10, S-NSSAI=1 | Group 2: UE Id=21-30, S-NSSAI=1 | Group 3: UE Id=41-50, S-NSSAI=1
User Traffic       Group 1: 5QI=1: Voice | Group 2: 5QI=9: Video, FTP | Group 3: 5QI=9: FTP
Mobility Class     Groups 1-2: 50% Low mobility, 50% High mobility | Group 3: Stationary
Frequency Band     All groups: low band and mid-band, e.g., n8 and n77
Bandwidth          All groups: DL: 20 MHz, 100 MHz; UL: 20 MHz, 100 MHz
Max. MIMO Layers   All groups: DL: 4, UL: 2

8 Table 9-6 Stationary UE service configuration profile

Cell                     A                    B
No. of stationary UEs    4                    6
Traffic Load             Medium, e.g., 50%    High, e.g., 90%

10 9.2.2.5 Initial Condition

11 1. The End-to-end setup shall be 5G SA, operational, and there should be no connectivity issues.
12 2. The test setup is configured according to the test configuration.


1 9.2.2.6 Test Procedure

2 1. Disable traffic steering optimisation by Near-RT RIC and Non-RT RIC (TS Mode 0 [38]).
3 2. Power on the end user devices and ensure they register with the 5G core for data, video, or voice service over SA
4 by connecting over the O-RAN (O-RU, O-DU and O-CU-CP/O-CU-UP).
5 3. Once the registration is complete, the end user devices establish PDU sessions with the 5G core.
6 4. Once the PDU sessions have been setup, the end user devices with voice or video traffic register with the IMS core
7 to support Voice-over-NR or Video-over-NR service.
8 5. Take appropriate action to enable the traffic according to the traffic configuration profile as discussed in Section
9 9.2.2.4.
10 6. Once the traffic has started, move the devices according to the mobility patterns and mobility classes configured
11 (Section 9.2.2.4).
12 7. Observe the policies and commands from RIC over E2, O1 and A1 interfaces, capture traces over the interfaces (as
13 required) for validating successful test execution.
14 8. Keep the test running for 10 minutes, measure and record the KPIs as defined in Section 9.2.2.7.
15 9. Take appropriate action to reset the network and traffic conditions to their initial state before Step 3.
16 10. Enable the traffic steering optimisation from the Near-RT and Non-RT RIC (TS Mode 1 or TS Mode 2 [38]), and
17 repeat Steps 3-9.
18 11. Analyse the KPI measurement results. Evaluate the performance gain with Non-RT RIC and Near-RT RIC
19 optimisation enabled versus that with it disabled (an illustrative automation sketch follows this procedure).
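For illustration only (informative): the A/B structure of this procedure lends itself to automation. In the sketch below, set_ts_mode, run_scenario, collect_kpis and reset_network are hypothetical stand-ins for test-equipment and SMO integrations; they are not APIs defined by this specification.

```python
# Minimal sketch (non-normative) of automating the A/B comparison above. The
# four hooks are hypothetical stand-ins for test-equipment and SMO integrations.
def set_ts_mode(mode): print(f"configure traffic steering: TS Mode {mode}")
def run_scenario(duration_s): print(f"run traffic and mobility for {duration_s} s")
def collect_kpis(): return {"dl_sys_tput_mbps": 0.0}   # placeholder KPI record
def reset_network(): print("reset network and traffic to the initial state")

def execute_ts_test(optimised_mode=2, duration_s=600):
    results = {}
    for mode in (0, optimised_mode):      # baseline (Step 1) versus Step 10
        set_ts_mode(mode)
        run_scenario(duration_s)          # Steps 2-7: registration, traffic, mobility
        results[mode] = collect_kpis()    # Step 8: Section 9.2.2.7 KPIs
        reset_network()                   # Step 9
    return results                        # input to the Step 11 gain analysis

print(execute_ts_test())
```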

20 9.2.2.7 Expected Result

21 The exact optimization behaviors from Non-RT RIC and Near-RT RIC are not specified due to the diversity of vendor-
22 specific RRM and MAC layer scheduler implementations in the O-RAN system making it impossible to specify a
23 standardized behaviour pattern. However, behaviors similar to the example below should be observed from the trace
24 information captured from the emulated UEs.
25 1. For UEs in Group 1, voice traffic (5QI=1) should preferably reside on Cell B to minimize inter-cell handovers
26 for service continuity.
27 2. For UEs in Group 2, the MBB traffic (5QI=9) should preferably reside on Cell A if the radio and cell load
28 condition on Cell A gives higher throughput and user experience for MBB service than Cell B.
29
30 Serving cell and frequency band change for each UE in UE Group 1 and 2 can be observed from the trace information
31 captured from the emulated UEs in order to confirm the Traffic Steering optimization behaviors from the Non-RT RIC
32 and Near-RT RIC as discussed above.
33
34 The performance of the system with Non-RT RIC and Near-RT RIC optimisation should be benchmarked through analysis
35 of KPI measurement results as discussed in Section 9.2.1.4. The performance gain with Non-RT RIC and Near-RT RIC
36 optimisation is evaluated by comparing the test result with RIC optimisation enabled versus that with it disabled. The
37 number of UEs meeting the QoS requirement for voice service (5QI=1) and MBB service (5QI=9), the count of service
38 drops, and the total system throughput and service experience for MBB service with Non-RT RIC and Near-RT RIC
39 optimisation in Step 10 (Section 9.2.2.6) should outperform the case without the optimisation in Steps 1-8 (Section
40 9.2.2.6).
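For illustration only (informative): the sketch below counts UEs meeting the 5QI targets. The packet delay budget and packet error rate values in the comments follow 3GPP TS 23.501 Table 5.7.4-1; the per-UE summary record format is an assumption of the sketch.

```python
# Minimal sketch (non-normative): count the UEs meeting the 5QI targets of
# 3GPP TS 23.501 Table 5.7.4-1 (5QI=1: packet delay budget 100 ms, PER 1e-2;
# 5QI=9: 300 ms, PER 1e-6). One summary record per UE per 5QI is assumed.
QOS_TARGETS = {1: {"pdb_ms": 100.0, "per": 1e-2}, 9: {"pdb_ms": 300.0, "per": 1e-6}}

def ues_meeting_qos(records, five_qi):
    """records: per-run summaries such as
    {"ue_id": 1, "5qi": 1, "delay_ms": 62.0, "per": 0.003}."""
    target = QOS_TARGETS[five_qi]
    return [r["ue_id"] for r in records if r["5qi"] == five_qi
            and r["delay_ms"] <= target["pdb_ms"] and r["per"] <= target["per"]]

records = [{"ue_id": 1, "5qi": 1, "delay_ms": 62.0, "per": 0.003},
           {"ue_id": 2, "5qi": 1, "delay_ms": 130.0, "per": 0.001}]
print(len(ues_meeting_qos(records, 1)))  # 1 UE meets the 5QI=1 targets
```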

41 The following statistical KPI measurements should be included in the test report:

42 System level KPIs:

43 • Number of UEs meeting the QoS requirement for voice service (5QI=1)
44 • Number of UEs meeting the QoS requirement for MBB service (5QI=9)
45 • Average system DL PDCP throughput
46 • Average system UL PDCP throughput


1
2 Cell level KPIs (for cells in performance monitoring group) [35]:

3 • Average DL throughput in gNB per cell


4 • Average UL throughput in gNB per cell
5 • Average DL delay between NG-RAN and UE per cell
6 • Average UL delay between NG-RAN and UE per cell
7 • DL Cell PDCP SDU Data Volume
8 • UL Cell PDCP SDU Data Volume
9
10 End user KPIs (for UEs in performance monitoring group) [35]:

11 • Average DL UE throughput in gNB per UE


12 • Average UL UE throughput in gNB per UE
13 • Average DL delay between NG-RAN and UE per UE
14 • Average UL delay between NG-RAN and UE per UE
15
16 KPIs related to user experience for Voice service:
17 • MOS Score – Mean Opinion Score for the voice call. This needs to be measured on both ends of the Voice call
18 – mobile originated and mobile terminated.
19 • Mute Rate % – This is the percentage of calls which were muted in both directions (calls with RTP loss of > 3-
20 4s in both directions are considered muted calls). This needs to be measured on both ends of the Voice call –
21 mobile originated and mobile terminated – and counted only once per call (see the sketch after this list).
22 • One Way Calls % – This is the percentage of calls which were muted in any one direction (calls with RTP loss
23 of > 3-4s in one direction only are considered one-way calls). This needs to be monitored on both ends of the
24 Voice call – mobile originated and mobile terminated – and counted only once per call.
25 • RTP Packet Loss % - Number of RTP packets which were dropped/lost in uplink/downlink direction as a
26 percentage of total packets. This needs to be measured on both ends of the Voice call – mobile originated and
27 mobile terminated.
28
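For illustration only (informative): the mute/one-way classification above can be computed from the longest contiguous RTP loss gap observed per direction for each call. The sketch uses 3 s as the threshold from the "> 3-4 s" criterion; the input format is an assumption of the sketch.

```python
# Minimal sketch (non-normative): classify calls as muted or one-way from the
# longest contiguous RTP loss gap per direction. Each call is counted once.
GAP_THRESHOLD_S = 3.0

def classify_calls(calls):
    """calls: list of (max_dl_gap_s, max_ul_gap_s) tuples, one per call."""
    muted = one_way = 0
    for max_dl_gap, max_ul_gap in calls:
        dl_lost, ul_lost = max_dl_gap > GAP_THRESHOLD_S, max_ul_gap > GAP_THRESHOLD_S
        if dl_lost and ul_lost:
            muted += 1
        elif dl_lost or ul_lost:
            one_way += 1
    n = len(calls)
    return {"mute_rate_pct": 100.0 * muted / n, "one_way_calls_pct": 100.0 * one_way / n}

print(classify_calls([(0.2, 0.1), (4.1, 5.0), (3.5, 0.0)]))
# {'mute_rate_pct': 33.33..., 'one_way_calls_pct': 33.33...}
```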
29 KPIs related to user experience for Video service:
30 • Video start time or Time to load first video frame – Time from when the video was selected to play to when
31 the video starts playing in seconds.
32 • Video MOS score – MOS score for the video streaming session as defined by ITU-T P.1203.3.
33
34 KPIs related to user experience for FTP service:
35 • Download throughput – This is the average application layer throughput to download the file in kbps.
36 • Upload throughput – This is the average application layer throughput to upload the file in kbps.
37 • Time taken to Download file – This is the time taken to download the file in seconds.
38 • Time taken to Upload file – This is the time taken to upload the file in seconds.
39

40
41
42

43

44

45


1 Annex A: Template of test report


2 The test report template provided in this annex is intended for general use in reporting E2E test results. This template is
3 not intended for use in obtaining any O-RAN Certificate or O-RAN Badge. For more details on the process and
4 procedures for use in seeking an O-RAN E2E Badge and for the test report templates to be used for this purpose, see
5 [42] and [43].

6 A. GENERAL INFORMATION
A1 Name of test campaign

A2 Version of the report – reference ID A3 Date(s) of testing

A4 Contact person (tester) – incl. Name, Organization, E-mail address

A5 Test location (lab) – incl. the address

A6 Description of test campaign, summary of test results, conclusions

7 List of tests - details of each test can be found in Section E.


Test No. Test name Test status [ PASS / FAIL / - ]
01
02
03


1 B. TEST AND MEASUREMENT EQUIPMENT AND TOOLS

# Equipment or tool Type Manufacturer Version (HW/SW) Notes#


01
02
03
2 # Specific details such as frequency range, date of last calibration, Type and version of Operating System at end-user device/application server, etc.

4 C. SYSTEM UNDER TEST


C1 Total number of DUTs included in SUT C2 Deployment architecture (indoor/outdoor, micro/pico/macro)

C3 Description of SUT – block diagram – connection scheme

5 DUT 1#
C4 Type C5 Serial Number C6 Supplier (manufacturer)

C7 SW version C8 HW version

C9 Interface/IOT profile(s) if applied

C10 Description incl. parameters, setting/configuration

6 # If SUT contains more DUTs, please copy the table.

8 D. TEST CONFIGURATION
D1 Carrier (central) frequency – (NR-)ARFCN D2 Total transmission (effective) bandwidth D3 Number of total Resource Blocks

D4 Sub-carrier spacing D5 Cyclic prefix length D6 Slot length

D7 Duplex mode D8 TDD DL/UL ratio (configuration) D9 IPv4 / IPv6

eNB/gNB UE


D10 Number of supported MIMO layers at eNB/gNB D11 Number of supported MIMO layers at UE

D12 Antenna configuration - number of Tx/Rx at eNB/gNB D13 Antenna configuration - number of Tx/Rx at UE

D14 Total antenna gain at eNB/gNB D15 Total antenna gain at UE

D16 Total transmit power at antenna connectors at eNB/gNB D17 Total transmit power at antenna connectors at UE

2 E. TEST RESULTS
E1 Test No. E2 Test name

E3 Date(s) of test execution E4 Reference to test specification

E5 Utilized test and measurement equipment and tools, incl. the specific setting/configuration – reference to Section B

E6 Test setup – connection/block diagram – deployment scenario (dense urban/urban/rural, LOS/NLOS/nLOS)

E7 Test procedure – describe differences in comparison with the test procedure defined in test spec. – limitations

E8 Test results – measured KPIs – incl. attachment of raw log file(s)

E9 Notes, incl. observed issues with the solutions

E10 Conclusions – pass/fail – assessment of measured KPIs (in comparison with the expected KPIs) – gap analysis
