
Mobile Network QoS Management
From Network-Centric Operations to Experience & Service-Centric Operations

Expresso Senegal, March 2018
Introduction: Quality of Service Challenges for Operators

• Maximize the utilization of offered services.

• Increase revenues.

• Stay ahead of competitors.

• Develop and introduce new, scalable services.

• Optimize network performance.

• Ensure and improve the Return on Investment (ROI).


Traditional QoS Management: Network-Centric Operations

The traditional 2G/3G/4G optimization loop cycles through four stages:

• Performance monitoring (OSS KPIs, e.g. U2000)
• Drive testing and post-processing (probe/TEMS)
• Analysis and network data updating
• Customer complaints follow-up

Typical analysis activities within the loop:

• Radio KPI analysis and control
• Coverage and interference analysis
• Mobility and inter-system interoperability analysis
• Network integrity and capacity auditing
• Pilot pollution control and CSFB performance
• Layer 3 analysis of failed events (set-up, drop, ...), network- or UE-based
• Cluster optimization and re-planning
• Root-cause analysis of low throughput
• Feature upgrades and follow-up


Performance Monitoring and root-cause follow-up: KPI based

[Figure: map of cells, each annotated with its call volume, drop call rate and data drop rate. The network engineer focuses on the cells whose drop call rate exceeds the 2% threshold set by the SLA.]
PM and root-cause follow-up: Enhanced KPI based, TopN

[Figure: the same cell map, but instead of a fixed SLA threshold the network engineer focuses on the TopN cells/sites with the highest call drop rate (CDR).]
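As an illustration of the TopN approach, here is a minimal Python sketch that ranks cells by drop call rate and keeps the N worst; the record layout, example values and the min_calls traffic filter are illustrative assumptions, not any specific OSS export format.

```python
# Rank cells by drop call rate (CDR) and keep the TopN worst performers.
# In practice the records would come from an OSS KPI export.

def top_n_worst_cells(kpis, n=3, min_calls=1000):
    """Return the n cells with the highest drop call rate.

    Low-traffic cells are filtered out so that a handful of drops on
    an idle cell does not dominate the ranking.
    """
    busy = [c for c in kpis if c["calls"] >= min_calls]
    return sorted(busy, key=lambda c: c["drop_call_rate"], reverse=True)[:n]

kpis = [
    {"cell": "A", "calls": 9720,  "drop_call_rate": 2.3},
    {"cell": "B", "calls": 6700,  "drop_call_rate": 3.3},
    {"cell": "C", "calls": 5400,  "drop_call_rate": 3.5},
    {"cell": "D", "calls": 11200, "drop_call_rate": 1.6},
]

for cell in top_n_worst_cells(kpis):
    print(cell["cell"], cell["drop_call_rate"])   # C 3.5, B 3.3, A 2.3
```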
Performance Monitoring and root-cause follow-up: Alarm based

Trouble-ticket (TT) escalation workflow:

1. Start: a fault happens and is detected and confirmed by the NSM or FM system.
2. A TT is assigned to the Front Office (FO), which analyzes the fault.
3. If the FO solves it and the fix is confirmed, the TT is closed (by FO / BO / FM). End.
4. Otherwise, the TT is escalated to the Back Office (BO). If the BO solves it, the TT is closed.
5. Otherwise, the TT is escalated to the third level, Field Maintenance (FM), which works the fault until it is solved and the TT can be closed.
Drive Testing and Post-Processing: Coverage Issues in 3G

Decision flow over drive-test measurements:

1. Is the best server's CPICH RSCP > -90 dBm? If not, flag a coverage problem area and apply coverage optimization: antenna tilt/height change, pilot power increase, hardware change.
2. Is the best server's Ec/Io > -12 dB? If not, flag a dominance problem area (low best-server Ec/No) and analyze missing neighbors and the pilot power of neighbors.
3. Is the 4th best server's Ec/Io more than 6 dB below the best server's? If not, flag a dominance problem area (high best-server Ec/No, i.e. pilot pollution) and run dominance-area analysis and optimization.
4. Once the measurement analysis is complete, implement the optimization changes and re-test.
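The decision flow above can be expressed compactly in code. Below is a minimal sketch using the slide's thresholds (-90 dBm CPICH RSCP, -12 dB Ec/Io, 6 dB dominance margin); the sample structure and function name are assumptions for illustration.

```python
# Classify one 3G drive-test sample following the flow above.

def classify_sample(best_rscp_dbm, best_ecio_db, fourth_ecio_db):
    """Map one measurement sample to a problem category."""
    if best_rscp_dbm <= -90:
        return "coverage problem"            # tilt/height, pilot power, hardware
    if best_ecio_db <= -12:
        return "dominance problem (low best-server Ec/No)"   # neighbors, pilots
    if fourth_ecio_db > best_ecio_db - 6:
        return "dominance problem (pilot pollution)"         # 4th server too strong
    return "ok"

print(classify_sample(-95, -8, -20))   # coverage problem
print(classify_sample(-80, -14, -20))  # dominance problem (low best-server Ec/No)
print(classify_sample(-80, -8, -11))   # dominance problem (pilot pollution)
print(classify_sample(-80, -8, -20))   # ok
```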
Analysis and Change Implementation

Cluster optimization workflow:

1. Collect user complaints per cell and cluster (complaint type, voice calls, throughput) and rank them to identify the worst clusters.
2. For each worst cluster, run the cluster optimization audits: RF parameter audit, end-to-end traffic analysis, physical site parameter audit, and competitive benchmark audit.
3. Combine the radio performance audit with the core network and transport audits into a mutual optimization plan spanning RAN, core and transport.
4. Implement the changes and run a post-optimization drive test. If the cluster is optimized, issue the report; otherwise iterate.
Enhanced QoS Management: Experience & Service Centric Operations

With the development of mobile networks, customer needs and behaviors have changed. Mobile communication now means far more than simple voice: mobile Internet brings web surfing, video telephony, streaming media, and microblogging. Focusing on traditional KPIs alone is no longer adequate for measuring the quality of mobile services, and the objective of network optimization has gradually shifted from enhancing network performance to improving quality of experience (QoE). Assessing and optimizing QoE is therefore the trend in optimizing today's mobile networks.
Traditional Evaluation Method: KPI ≠ End-Users' Experience

 Individual user experience is drowned in averaged KPIs. Example (cell 129): the drop call rate is 0.0846% (excellent), yet within 3 hours 4 users dropped calls at least twice and 2 users dropped all of their calls.

 Element KPIs ≠ end-to-end service success rate. Every network domain can report an excellent KPI (e.g. 99%), while the end-user service success rate is only 40%.

 Subjective service quality is not monitored. MOS from drive tests (e.g. 3.6, excellent) evaluates end-user experience directly, but only on a sample; many users with bad experience (noise, echo, ...) remain outside monitoring. Example complaint volumes (GSM, per month): vague audio, 288; one-way audio/noise/echo/..., 412.

How can we define and manage the end-users’ experience?


Experience & Service Operations View

Instead of a single cell-level KPI view (e.g. Calls: 7500, Drop Call Rate: 1.5%, Data Drop: 1.8%), a customer- and service-centric view with proactive action answers operational use cases such as:

• How many individual subscribers made calls? How many are corporate vs. mass market, and what is their Customer Experience Index (CEI)?
• Which devices are being used (e.g. iPad, iPhone, Samsung Galaxy S3), and how does each perform?
• Who is impacted by call drops, call failures, data access failures, data drops or low throughput? Is anyone experiencing failure right now?
• Which services are failing (data and voice, network voice, video call)?

This view feeds service mix & OTT service management, product planning & campaigns, hot spots & high-value customer (HVC) focus, and investment & optimization.
Assessment of a Set of QoE KPIs (Technical)

The evaluation hierarchy, from top to bottom:

• QoE: overall grade (Excellent / Good / Fair / Poor / Bad)
• Sub-QoE: Coverage / Accessibility / Retainability / Integrity
• KQIs (service level), e.g. streaming service setup success rate, streaming service setup delay, streaming service download completion rate
• KPIs (network level), e.g. RRC connection setup success rate, TBF setup success rate, attach success rate, PDP context activation success rate, streaming server connection success rate
• Data sources: RAN/CN/VAS performance data: GSM data, WCDMA data, CDMA/4G signaling, Wi-Fi, DT/CQT data collection, CDR/MR data
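To illustrate how a service-level KQI might be derived from the network-level KPIs beneath it, the sketch below approximates the streaming setup success KQI as the product of the per-step KPI success rates. Treating the steps as independent and multiplying them is a simplifying assumption; the actual mapping function is operator-defined.

```python
# Hypothetical decomposition: a streaming setup succeeds only if every
# step in the chain succeeds, so the KQI is approximated as the product
# of the per-step KPI success rates.
import math

def streaming_setup_success_kqi(kpis):
    steps = ["rrc_setup", "attach", "pdp_activation", "server_connection"]
    return math.prod(kpis[step] for step in steps)

kpis = {  # illustrative KPI values (success ratios)
    "rrc_setup": 0.995,
    "attach": 0.990,
    "pdp_activation": 0.985,
    "server_connection": 0.980,
}
print(f"{streaming_setup_success_kqi(kpis):.3f}")  # ~0.951
```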

What is the correlation between QoS and QoE?
• Technical correlation
 Possible through convergent indicators
• Perception correlation
 Difficult, but possible
• Methodology
 Identify QoS indicators/parameters that are technically measurable and influence the QoE
 Use intrusive measurements (with a reference)
 Evaluate both the indicators and the quality perceived by the user (QoE)
 Establish formulas for calculating QoE from the QoS indicators (see the sketch after this list)
 Apply these formulas to non-intrusive measurements
• Advantages
 Estimate the QoE from network indicators without the need for intrusive measurements
• Limits
 Requires adaptation to effects external to the user (e.g. the regulator)
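A minimal sketch of the "establish formulas" step, assuming a simple linear model: fit it on intrusive reference measurements where both QoS indicators and user-scored QoE are available, then apply it to non-intrusive measurements. The indicator choice and sample values are illustrative.

```python
import numpy as np

# Calibration data: QoS indicators measured alongside intrusive QoE scoring.
# Columns: drop rate (%), setup delay (s) -> target: user-scored QoE (MOS-like)
X = np.array([[0.5, 1.2], [1.0, 2.0], [2.0, 3.5], [3.0, 5.0], [0.8, 1.5]])
y = np.array([4.5, 4.0, 3.2, 2.4, 4.3])

# Ordinary least squares with intercept: QoE ~ a0 + a1*drop + a2*delay
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_qoe(drop_rate, setup_delay):
    """Estimate QoE from non-intrusive QoS measurements."""
    return coeffs @ [1.0, drop_rate, setup_delay]

print(round(float(estimate_qoe(1.5, 2.5)), 2))  # estimated QoE for a new sample
```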
Factors (Aspects) Affecting the QoE

• Technical
 The user's specific expectations in terms of quality indicators
 Technology trends
 User equipment

• User perception
 The user's particular context at the moment (need, mood, physical state, ...)
 Standard of living (price sensitivity)
 The social environment (culture, education level, customs, ...)
 The user's experience on this network or on other networks
 Fashion and trends
 Advertised offers/products (including those of competitors)

Example Parameters: Voice Quality

Parameter | Causes | Impact on QoE
Call establishment failure | Various (radio or core network) | Customer dissatisfaction
Echo | Transmission issue | Poor listening quality
Distortion | Interference | Hearing difficulties
Background noise | Wrong network configuration; outdated equipment (EOM, EOS); bad interconnection of network elements | Hearing difficulties
Call drops | Various (radio or core network) | Customer dissatisfaction
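The impairments above are often rolled up into a single listening-quality score. As one standard (not slide-specific) example, the ITU-T G.107 E-model converts a transmission rating factor R, which aggregates echo, noise, loss and delay impairments, into an estimated MOS:

```python
# ITU-T G.107 E-model: convert transmission rating factor R to estimated MOS.

def r_to_mos(r):
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

print(round(r_to_mos(80), 2))  # ~4.02: good quality
print(round(r_to_mos(50), 2))  # ~2.58: many users dissatisfied
```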

Example Parameters: Data Quality

Parameter | Cause | Effect | Impact on QoE
Packet loss | Transmission errors | Packet retransmission | Navigation delay; low download speed
Latency (end-to-end delay) | Queuing; congestion | Delay in real-time applications | Online gaming impossible; image lag in video surveillance
Jitter | Different routes; different queuing algorithms across network nodes | Packets arrive out of order; wait for reordering | Navigation delay; low download speed
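To make the jitter row concrete, here is the interarrival jitter estimator defined in RFC 3550 (RTP), which exponentially smooths the variation in packet transit times; the timestamps in the example are illustrative.

```python
# RFC 3550 interarrival jitter: J += (|D| - J) / 16, where D is the change
# in transit time between consecutive packets.

def interarrival_jitter(send_times, recv_times):
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times, recv_times):
        transit = received - sent
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            jitter += (d - jitter) / 16.0
        prev_transit = transit
    return jitter

send = [0.00, 0.02, 0.04, 0.06, 0.08]
recv = [0.05, 0.08, 0.09, 0.12, 0.13]   # uneven delays -> jitter
print(f"{interarrival_jitter(send, recv) * 1000:.2f} ms")  # ~2.28 ms
```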

Example Parameters: Video Quality

Parameter | Cause | Effect | Impact on QoE
Time to first image | Bad buffer sizing | Long wait before the image starts | Long wait before the image starts
Availability of the video | Server congestion; slow transmission link | Non-continuous video flow | Video stops; video skips
Resolution and frame rate (images per second) | Bad dimensioning of resolution vs. user bit rate | Transmission channel inadequate to route the video | Delays; video stops
Codec used | Bad compression, hence more data | Transmission channel inadequate to route the video | Delays; video stops
Poor delivery of video | Wrong sizing/configuration of the application player | Poor flow synchronization | Cut-offs; video stops
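As a back-of-the-envelope illustration of the "time to first image" row: the startup wait is roughly the data needed to fill the initial playout buffer divided by the available throughput, so a buffer dimensioned for the wrong bit rate directly inflates startup delay. All values below are illustrative.

```python
# Startup delay ~= data needed to fill the initial buffer / download throughput.

def startup_delay_s(video_bitrate_kbps, buffer_s, throughput_kbps):
    """Seconds until the initial buffer is filled and playback can start."""
    buffered_kbits = video_bitrate_kbps * buffer_s
    return buffered_kbits / throughput_kbps

# 2 Mbps video, 5 s initial buffer:
print(startup_delay_s(2000, 5, 8000))  # 1.25 s on a fast link
print(startup_delay_s(2000, 5, 2500))  # 4.0 s on a marginal link
```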

Converting from KPI to QoE

The mapping is layered:

• User experience (marketing / management perception): QoE
  QoE = g(KQI_1, KQI_2, ..., KQI_n) = g(f(KPI_1, KPI_2, ..., KPI_n))
• Service experience: KQIs, defined by service type, plus additional quality data
  Voice quality: accessibility, call drop, speech quality
  Web quality: accessibility, web delay, download speed
  SMS quality: accessibility, access delay
  ...
  KQI_n = f(KPI_1, KPI_2, ..., KPI_n)
• Foundation network KPIs, defined by network domain, plus additional network performance data: radio KPIs, bearer network KPIs, core network / VAS KPIs
  Wireless network performance, transmission network performance, core network performance, service system performance, IP network performance, ...
  Measured across the network elements (Node B, RNC, MSC/MGW, IP network)
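A minimal sketch of the layered mapping QoE = g(f(KPIs)), assuming toy definitions of f and g; real operators define their own service-specific functions and weights.

```python
# Two-layer mapping: KPIs -> KQIs (f) -> QoE (g). The concrete f and g
# below are illustrative placeholders.

def f_web_kqis(kpis):
    """f: network-level KPIs -> service-level KQIs (illustrative)."""
    return {
        "accessibility": kpis["rrc_success"] * kpis["pdp_success"],
        "download_speed_kbps": kpis["throughput_kbps"],
    }

def g_qoe(kqis):
    """g: KQIs -> scalar QoE on a 1..5 scale (illustrative weighted sum)."""
    access_score = 5.0 * kqis["accessibility"]
    speed_score = min(5.0, kqis["download_speed_kbps"] / 128.0)  # 640 kbps caps at 5
    return 0.6 * access_score + 0.4 * speed_score

kpis = {"rrc_success": 0.99, "pdp_success": 0.98, "throughput_kbps": 512}
print(round(g_qoe(f_web_kqis(kpis)), 2))  # QoE = g(f(KPIs)) ~= 4.51
```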

 Establish a QoE evaluation system
 Set up QoE / KQI / KPI mapping relationships
 Focus more on KQIs than on KPIs
 Transform objective KPIs into subjective QoE
Mapping example: web browsing service

Category: Accessibility
• KQI: first page loading success ratio (weight 40%). Benchmarks: ≥99% → 5; 95%-99% → 4; 80%-95% → 3; 70%-80% → 2; <70% → 1
• KQI: average first page loading speed (weight 10%). Benchmarks: ≥512 kbps → 5; 256-512 kbps → 4; 128-256 kbps → 3; 64-128 kbps → 2; <64 kbps → 1

Category: Retainability
• KQI: web browsing data transfer cut-off ratio (weight 10%). Benchmarks: ≤0.1% → 5; 0.1%-0.5% → 4; 0.5%-1% → 3; 1%-2% → 2; >2% → 1

Category: Integrity
• KQI: web page refreshing success ratio (weight 25%). Benchmarks: ≥99% → 5; 95%-99% → 4; 80%-95% → 3; 70%-80% → 2; <70% → 1
• KQI: average page refreshing speed (weight 15%). Benchmarks: ≥512 kbps → 5; 256-512 kbps → 4; 128-256 kbps → 3; 64-128 kbps → 2; <64 kbps → 1

QoE grades: score ≥ 4.5 Excellent (very satisfied); 4-4.5 Very good (satisfied); 3.5-4 Good (a little unsatisfied); 3-3.5 Fair (much unsatisfied); < 3 Poor (most unsatisfied)

Worked example: QoE = 4 * 40% + 3 * 10% + 4 * 10% + 3 * 25% + 4 * 15% = 3.65, i.e. Good

• The system is initialized with default weight values and benchmarks for the KQIs; these still need to be agreed with the operator.
• An optimization report is generated automatically, and an alarm is sent out immediately on poor QoE status.
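The scoring rule above is mechanical enough to sketch in a few lines of Python. The benchmarks and weights below are the slide's defaults, and the KQI input values are chosen to reproduce the slide's worked example (score 3.65, Good):

```python
# Map each KQI value to a 1..5 score via its benchmark table, then take
# the weighted sum to obtain the QoE score.

SUCCESS_BANDS = [(0.99, 5), (0.95, 4), (0.80, 3), (0.70, 2)]   # ratio, >=
SPEED_BANDS   = [(512, 5), (256, 4), (128, 3), (64, 2)]        # kbps, >=
CUTOFF_BANDS  = [(0.001, 5), (0.005, 4), (0.01, 3), (0.02, 2)] # ratio, <=

def band_score(value, bands, lower_is_better=False):
    for limit, score in bands:
        if (value <= limit) if lower_is_better else (value >= limit):
            return score
    return 1

kqis = {  # (value, weight, bands, lower_is_better)
    "first page loading success": (0.97,  0.40, SUCCESS_BANDS, False),  # score 4
    "first page loading speed":   (200,   0.10, SPEED_BANDS,   False),  # score 3
    "data transfer cut-off":      (0.003, 0.10, CUTOFF_BANDS,  True),   # score 4
    "page refresh success":       (0.90,  0.25, SUCCESS_BANDS, False),  # score 3
    "page refresh speed":         (300,   0.15, SPEED_BANDS,   False),  # score 4
}

qoe = sum(w * band_score(v, b, lo) for v, w, b, lo in kqis.values())
print(qoe)  # 4*0.40 + 3*0.10 + 4*0.10 + 3*0.25 + 4*0.15 = 3.65 -> Good
```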
Estimate the Global QoE: Example of Weighting by Indicator

• Identify consumer preferences (via surveys) and weight each indicator accordingly:

Voice | Weight | QoS | QoE contribution
Coverage | 30% | 80.00% | 24.00%
Accessibility | 25% | 97.18% | 24.30%
Retainability | 25% | 98.58% | 24.65%
Speech quality | 20% | 90.87% | 18.17%
Total | | | 91.11%

SMS | Weight | QoS | QoE contribution
Originating successful | 40% | 97% | 38.80%
Terminating successful | 40% | 95% | 38.00%
Reception duration | 20% | 90% | 18.00%
Total | | | 94.80%

Video (streaming) | Weight | QoS | QoE contribution
Time of video starting | 40% | 85% | 34.00%
Intermittent stop of video | 30% | 90% | 27.00%
Video image quality | 20% | 96% | 19.20%
Voice quality of the video | 10% | 98% | 9.80%
Total | | | 90.00%

Data (DT) | Weight | QoS | QoE contribution
Data connection establishment | 30% | 95% | 28.50%
Email reception successful | 20% | 98% | 19.60%
Email sending successful | 20% | 93% | 18.60%
Continuous HTTP navigation | 15% | 95% | 14.25%
Download speed | 15% | 85% | 12.75%
Total | | | 93.70%

• Establish a weighting grid per service to obtain an assessment of the client's perception (QoE).
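Each grid is a plain weighted sum. A compact check of the voice grid (the same arithmetic applies to SMS, video and data):

```python
# Voice QoE as a weighted sum of per-indicator QoS scores (values from the grid).
voice = [(0.30, 0.80), (0.25, 0.9718), (0.25, 0.9858), (0.20, 0.9087)]
qoe = sum(weight * qos for weight, qos in voice)
print(f"{qoe:.4f}")  # 0.9111 -> 91.11%
```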
Field Measurements Template

Each level has around three or four KQIs, and each KQI is evaluated subjectively with the attached template:

Information: Accessibility | Reliability | Transparency
Sales approach: Agencies | Sales | Others
Products and offers: Product | Offers | Promotion
After sales: Accessibility | Support | Waiting time | Follow-up
Mobile network: Coverage | Voice call | SMS | Internet

Each KQI (Key Quality Indicator) is rated on a five-point scale: Excellent / Good / Acceptable / Bad / Poor.
QoE Monitoring Tools

 Application layer tools
 E.g. ping, FTP, HTTP browsing, MMS, SIP, WAP, etc.

 Field measurement tools
 Radio measurements plus application layer performance

 Protocol analyzers
 Protocol stack performance analysis at any interface

 Mobile QoS agents
 L1-L7 measurements, position and location
 Active and passive measurements

Mobile Quality Agent (MQA)

 Measures mobile multimedia service quality and radio parameters, and produces and reports performance statistics to central management servers.

 Uses active probing and/or passive monitoring, which turns thousands of commercial mobile phones into (secure and non-intrusive) service quality probing stations.

 A central management server derives KPIs from the reports of the QoS agents and manages the agents, i.e. dynamically dispatches, installs, and activates or deactivates them.
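A minimal sketch of the server side described above, assuming a hypothetical report schema: agents upload per-session records and the central server derives per-service KPIs from them. This is not the actual MQA protocol or API.

```python
# Hypothetical agent report: one record per measured session.
from dataclasses import dataclass

@dataclass
class AgentReport:
    device_id: str
    service: str        # "voice", "web", ...
    success: bool
    duration_ms: int

def derive_kpis(reports, service):
    """Central server: derive success rate and mean duration per service."""
    relevant = [r for r in reports if r.service == service]
    if not relevant:
        return None
    success_rate = sum(r.success for r in relevant) / len(relevant)
    mean_duration = sum(r.duration_ms for r in relevant) / len(relevant)
    return {"success_rate": success_rate, "mean_duration_ms": mean_duration}

reports = [
    AgentReport("ue-001", "voice", True, 1800),
    AgentReport("ue-002", "voice", False, 0),
    AgentReport("ue-003", "voice", True, 2100),
]
print(derive_kpis(reports, "voice"))  # success_rate 2/3, mean_duration_ms 1300.0
```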

Mobile Quality Agent: Nemo CEM

This document presents the results of the Nemo Customer Experience Monitor trial performed by Expresso Senegal between January 1st, 2018 and February 20th, 2018, along with the number of samples collected during that period.

[Figures: device performance analysis, best technology available, and voice call success/failure statistics.]

