Footprint

Changing the RF cost performance paradigm

Noise and the RF Infrastructure – (Part 4 of 6)
Coverage and capacity are critical cost performance metrics in wireless systems, with both generally limited by degraded receiver performance resulting from the introduction of unwanted noise. This interference can come from multiple possible sources, but is commonly introduced into the system by deteriorating components or a less than perfect RF infrastructure.

Introduction

Noise is a fact of life in radio communication systems, with some background noise always present. Noise can also emanate from many sources including:
• Cables, filters and other passive elements which exhibit a loss.
• Intermodulation Distortion (IMD), and
• The environment - broadband RF interference from static electricity, ionic clouds or corona discharge, electric appliances and tools, automobiles, lightning, spurious RF etc.

The radio receiver will see any interference as unwanted noise, and reducing noise levels or noise rise at the BTS receiver is key to network health. Any internal or external interference is likely to result in some degree of ‘deafening’ of a TDMA receiver or ‘robbing’ of Interference Margin in CDMA systems. The result is reduced coverage, capacity or both, causing:
• Dropped calls.
• Subscriber churn.
• Lost revenue, and
• High OPEX.

Noise limits the accuracy of the smallest measurements, since the detectable signal amplitude must be no less than the noise floor for TDMA systems, or within a tolerable level below it for spread spectrum technology. Receiver noise levels degrade further as evolving technologies utilize wider channel bandwidths.

Table 1: kTB Thermal 50 ohm Noise Floor

Technology   Channel Bandwidth   kTB Thermal Noise Floor
GSM          250 kHz             -119 dBm
CDMA         1.25 MHz            -112 dBm
WCDMA        5 MHz               -107 dBm
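The values in Table 1 follow from the thermal noise relationship kTB, usually expressed as roughly -174 dBm/Hz at room temperature plus 10*log10(bandwidth). The sketch below is a minimal cross-check assuming a 290 K reference temperature; the exact assumptions behind Table 1 are not stated, so the computed figures may differ from the tabulated ones by about a decibel.

import math

BOLTZMANN_J_PER_K = 1.380649e-23
REFERENCE_TEMP_K = 290.0  # assumed room temperature reference

def thermal_noise_floor_dbm(bandwidth_hz: float, temp_k: float = REFERENCE_TEMP_K) -> float:
    """kTB thermal noise power delivered to a matched load, expressed in dBm."""
    noise_watts = BOLTZMANN_J_PER_K * temp_k * bandwidth_hz
    return 10 * math.log10(noise_watts / 1e-3)

# Approximate channel bandwidths corresponding to Table 1
for tech, bw_hz in (("GSM", 250e3), ("CDMA", 1.25e6), ("WCDMA", 5e6)):
    print(f"{tech:6s} {bw_hz / 1e6:5.2f} MHz -> {thermal_noise_floor_dbm(bw_hz):7.1f} dBm")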

The specified sensitivity of a BTS receiver, in conjunction with expected noise levels, determines the geographic layout of the network and the hardware needed to support it. Higher-than-anticipated noise levels or noise rise challenge the planned cost performance model by reducing practical receiver sensitivity, coverage and capacity.
We have previously cited Holma and Toskala (WCDMA for UMTS, Nokia, Finland, 2004), who suggest a 1 dB loss in CDMA and UMTS receiver sensitivity can mean as much as an 11% loss of coverage.

Interference Generated Noise

Increased receiver interference adds to the noise floor, causing the rejection of subscriber calls that would otherwise be of acceptable quality.
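One way to see why a figure of this order is plausible is through a simple propagation model: with a path loss exponent n, losing 1 dB of link margin shrinks the cell radius by a factor of 10^(-1/(10n)), and the covered area by the square of that factor. The sketch below uses assumed urban path loss exponents and is not a calculation taken from Holma and Toskala.

def coverage_area_loss_percent(margin_loss_db: float, path_loss_exponent: float) -> float:
    """Approximate percentage loss of cell area when the link budget loses
    margin_loss_db of margin, assuming path loss grows as distance**n."""
    radius_factor = 10 ** (-margin_loss_db / (10 * path_loss_exponent))
    return (1 - radius_factor ** 2) * 100

# 1 dB of lost sensitivity under typical (assumed) urban path loss exponents
for n in (3.5, 3.8, 4.0):
    print(f"n = {n}: about {coverage_area_loss_percent(1.0, n):.0f}% of cell area lost")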

Figure 1

Noise and Network Topology

In today’s wireless networks, mobile connectivity is maintained by handing a call from one cell to the next along the subscriber’s route. The point of handover is determined by the subscriber’s detected signal quality, or Bit Error Rate (BER), with the actual acceptable signal and noise level ratios varying between TDMA and CDMA technologies. RF theory suggests the noise floor level is the sum of all the noise sources and unwanted signals within a system.
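Because those contributions are normally quoted in dBm, summing them means converting each to linear power, adding, and converting back to dBm. A minimal sketch with purely illustrative levels:

import math

def combine_noise_dbm(levels_dbm):
    """Combine uncorrelated noise and interference contributions quoted in dBm."""
    total_mw = sum(10 ** (level / 10) for level in levels_dbm)
    return 10 * math.log10(total_mw)

# Illustrative values only: a thermal floor plus two interference contributions
print(round(combine_noise_dbm([-112.0, -115.0, -118.0]), 1))  # about -109.6 dBm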

Figure 2

Figure 2 is the display from a spectrum analyser set to “Maximum Hold”, capturing wideband noise resulting from the presence of PIM in the RF interconnection over time. In this case GSM900 and WCDMA800 were co-located, with the resulting PIM products degrading the receiver noise floor to as high as -95 dBm/30 kHz. This loss in receiver sensitivity is a loss in coverage and capacity. (Case studies supporting these findings are provided in part 5 of this series of papers.)

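To put the -95 dBm/30 kHz figure in perspective, the kTB floor in a 30 kHz measurement bandwidth is roughly -129 dBm, so the observed floor represents a noise rise in the order of 34 dB over thermal. A quick sketch of that comparison, assuming the usual -174 dBm/Hz room temperature reference:

import math

kt_dbm_per_hz = -174.0        # assumed room temperature kT reference
measurement_bw_hz = 30e3      # 30 kHz measurement bandwidth
observed_floor_dbm = -95.0    # degraded noise floor from the Figure 2 case

thermal_floor_dbm = kt_dbm_per_hz + 10 * math.log10(measurement_bw_hz)
noise_rise_db = observed_floor_dbm - thermal_floor_dbm

print(f"kTB floor in 30 kHz: {thermal_floor_dbm:.1f} dBm")   # about -129 dBm
print(f"Noise rise over thermal: {noise_rise_db:.1f} dB")    # about 34 dB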
De-Sensitization, Saturation and Blocking

Signal to Noise Ratio (SNR) is also degraded when large interfering signals saturate a receiver, compromising signal gain and creating distortion. The degradation worsens as the interference increases in power. This effect is often called blocking, saturation or de-sensitization. Every component in the receiver has a non-linear characteristic, and when driven into saturation, both in-band and out-of-band harmonics and intermodulation products can be generated inside the receiver even though they were not present in the input signal.

Uplink Noise Rise in CDMA Systems

In the spread spectrum wireless environment, uplink noise rise is defined as the relationship between the total noise and interference a receiver must contend with and the normal background thermal noise floor. CDMA system noise rise is the total of all prospective interference, remembering that every subscriber session is every other session’s interference. As session numbers increase, subscriber noise rise naturally increases toward a level that sets the maximum capacity of the receiver. Typically, RF power control at both ends of the wireless link is used to minimise session interference and maximise coverage and capacity. A limit is imposed when this rise over thermal exceeds a preset magnitude, e.g. 6 dB. Many scenarios determine how the natural noise rise will affect the receive system, but once it reaches the preset limit, only lowering the average bit rate, and hence the interference power density, can increase capacity. Subscriber uplink noise rise is a normal function of a healthy spread spectrum receiver; however, any unwanted noise in the system will rob capacity and coverage.
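The link between noise rise and uplink loading is commonly approximated in CDMA planning texts as noise rise = -10*log10(1 - load factor); that formula is an assumption here, not one stated in this paper. A short sketch showing why a 6 dB limit corresponds to roughly 75% of the theoretical pole capacity:

import math

def noise_rise_db(load_factor: float) -> float:
    """Uplink noise rise over thermal for a given load factor (0 <= eta < 1)."""
    return -10 * math.log10(1 - load_factor)

def load_factor_at_limit(noise_rise_limit_db: float) -> float:
    """Load factor at which the noise rise reaches a preset limit."""
    return 1 - 10 ** (-noise_rise_limit_db / 10)

print(round(noise_rise_db(0.5), 1))          # about 3 dB at 50% loading
print(round(load_factor_at_limit(6.0), 2))   # a 6 dB limit is reached near 75% loading

Any external interference, PIM for example, consumes part of this budget before a single subscriber is served, which is the capacity and coverage ‘robbery’ described above.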
Working with Noise

As discussed above, there are multiple possible noise sources. Removing one does not necessarily eliminate the others, but it does narrow the options. Changes in the cellular RF environment to meet demand for more services and lower cost performance, combined with the pressure to mix new technologies with those currently in use, introduce a potential IMD nightmare. It is therefore essential that the antenna and the supporting infrastructure are built using the same Quality Control (QC) standards as the radio equipment and associated components. (The previous paper in this series, ‘Working with PIM - part 3 of 6’, introduces PIM and discusses how it can be avoided at the BTS.)

Traditionally, TDMA technology created some challenges with PIM, but narrow channel bandwidths made working around prospective interference problems easier when there was plenty of available spectrum. PIM products either fell outside the receive band or affected only some channels. Where they fell out of band, the cause was either ignored or frequency planning provided a solution. Second order products were never considered to be a problem. Spectrum licensing cost limitations and wider bandwidth technology have forced a review of this practice, triggering the need to use every bit of available spectrum.

CDMA technology, with its 1.25 MHz bandwidth requirement, was more challenging from the start. With the pre-selection filters found in narrow band systems unable to be used, IMD is a very likely cause of noise rise, as broadband PIM products can enter the receiver directly, or act as components producing receiver IM by mixing with local oscillator components. Second order products are also more likely to be a problem. The best and most cost effective way to avoid receive noise rise due to any IMD derivative is to eliminate or significantly reduce its presence. This means correcting the source. Testing for PIM at the BTS is proving to be the most effective way to confirm this has been achieved.
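As a quick way of seeing how PIM products can fall inside or outside a receive band, the sketch below applies the classic second and third order frequency arithmetic; the carrier frequencies and band edges are hypothetical examples, not values taken from this paper.

def pim_product_frequencies(f1_mhz: float, f2_mhz: float) -> dict:
    """Second and third order passive intermodulation product frequencies
    for two transmit carriers f1 and f2 (MHz)."""
    return {
        "2nd order": [abs(f2_mhz - f1_mhz), f1_mhz + f2_mhz],
        "3rd order": [2 * f1_mhz - f2_mhz, 2 * f2_mhz - f1_mhz],
    }

# Hypothetical GSM900 downlink carriers checked against an uplink receive band
rx_band_mhz = (880.0, 915.0)
for order, freqs in pim_product_frequencies(935.0, 958.0).items():
    for f in freqs:
        hit = rx_band_mhz[0] <= f <= rx_band_mhz[1]
        print(f"{order}: {f:7.1f} MHz -> {'falls in the Rx band' if hit else 'out of band'}")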

Summary

The use of a PIM test as an RF infrastructure performance metric is now being adopted or reviewed by wireless network operators around the world. Working together, Summitek Instruments and Triasx are addressing the growing global interest in Interconnection Quality Analysis (iQA), and have created an Interconnection Technology Centre (iTC) to provide industry support. IMD is often the root cause of noise rise, and as the RF spectral density at the BTS increases, the possibility of noise being present in the form of Passive Intermodulation (PIM) increases. This is a direct result of non-linearity in the RF system caused by poor quality components or physical construction.

For further information visit our websites www.summitekinstruments.com and www.triasx.com
