
INVITED PAPER, UNDER REVIEW FOR PROCEEDINGS OF THE IEEE – SPECIAL ISSUE ON COGNITIVE RADIO


Performance Evaluation of Cognitive Radios:
Metrics, Utility Functions and Methodologies
Youping Zhao, Member, IEEE, Shiwen Mao, Member, IEEE, James O. Neel, Member, IEEE,
and Jeffrey H. Reed, Fellow, IEEE

Abstract— The selection and design of performance metrics
and utility functions is an important, but inadequately addressed
issue in the design of cognitive radio networks. Unlike traditional
radios, a cognitive radio may change its objectives as radio
scenarios vary. Because of the dynamic pairing of objectives and
contexts, it is imperative for cognitive radio network designers
to have a firm understanding of the inter-relationships between
goals, performance metrics, utility functions, link/network performance, and operating environments. In this paper, we first
overview the hierarchical metrics for evaluating the performance
of cognitive radios from the node, network, and application
levels. From a game-theoretic viewpoint, we then show that the
performance evaluation of cognitive radio networks exhibits the
interdependent nature of actions, goals, decisions, observations,
and context. We discuss the inter-relationships between metrics, utility functions, cognitive engine algorithms and achieved
performance. Various testing scenarios need to be employed to
comprehensively evaluate the cognitive functionality of cognitive
radios. We propose the radio environment map-based scenario-driven testing (REM-SDT) for thorough performance evaluation
of cognitive radios. An IEEE 802.22 WRAN cognitive engine
testbed is presented to provide further insights into this important
problem area.

Index Terms— Cognitive radio, cognitive wireless network, game theory, performance evaluation, performance metric, utility function.

Y. Zhao is with the Shared Spectrum Company, Vienna, VA 22182. Email: yzhao@sharedspectrum.com.
S. Mao is with the Department of Electrical and Computer Engineering, Auburn University, Auburn, AL 36849. Email: smao@ieee.org.
J. O. Neel is with Cognitive Radio Technologies, Lynchburg, VA 24502. Email: james.neel@crwireless.com.
J. H. Reed is with the Bradley Department of Electrical and Computer Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061. Email: reedjh@vt.edu.

I. INTRODUCTION

Formalizing a decades-long trend toward radios that self-optimize in response to changing conditions, cognitive radio (CR) and cognitive wireless networks define a design paradigm for introducing numerous new adaptive algorithms, which enable much higher spectrum utilization, provide more reliable and personalized radio services, reduce harmful interference, and facilitate the interoperability or convergence of different wireless communication networks. The term "cognitive radio" was initially coined by Dr. Joseph Mitola III in the late 1990s [42], [43]. In a broad sense, some preliminary CR technologies (e.g., adaptations in transmit power and dynamic channel selection in response to varying RF environments) have already been employed in many existing wireless networks [66], such as cellular networks, cordless phone systems, and WLANs. In a narrow sense, comprehensive situation awareness and intelligent learning capability, which have not been realized or fully exploited by current radios, are two defining features of future advanced CRs.

The CR paradigm will drive the next generation of radio devices and wireless standards to enable a variety of new applications in demanding environments, such as spectrum-sharing networks, natural disasters, civil emergencies, and military operations. Examples of CR-oriented networks include the following: the U.S. DARPA XG [41] and WNaN [69] programs, IEEE 802.22 WRAN [24] (aiming to support data services in the TV bands as secondary users), IEEE 802.16h (aiming to improve coexistence mechanisms for license-exempt operation), IEEE 802.11h (supporting dynamic frequency selection and transmit power control for WLANs to share spectrum), IEEE 802.11y (enabling Wi-Fi-like equipment to operate on a secondary basis in licensed frequency bands), the European end-to-end reconfigurability (E2R) research program [20], and the networks proposed by the White Spaces Coalition [2].

A. Building Blocks of CR

The building blocks and overall system flow of the CR model considered in this paper are illustrated in Fig. 1 [82]. In this model, the radio environment map (REM) is an integrated information structure (i.e., a database) that consists of multi-domain information for CR (or metrics), such as geographical features, available services, spectral regulations, locations and activities of radio devices, policies of users and service providers, and past experience [80], [83]. During operation, the CR observes the operational environment via sensor(s) and obtains the necessary situation awareness about the current radio scenario by leveraging the sensing results and the REM. The "brain" or intelligent agent of the CR, the cognitive engine (CE), then determines an appropriate utility function based on the policy and the goals, by considering the specific application or radio scenario. The utility function maps the current state of the CR, usually represented by an array of chosen metrics, to a value indicating how close the state is to the desired (or optimal) CR state. The most pertinent performance metric(s) should be taken into account and incorporated into a proper utility function to meet the CR's goal for the specific radio scenario or application.

By leveraging past experience and knowledge, the CE can choose the most efficient reasoning and learning method and make (near-)optimal (cross-layer) adaptations subject to constraints of regulation, policy, and radio equipment capability.
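The mapping the CE performs from an array of chosen metrics to a single value can be sketched in code. The following is a minimal illustration, not the paper's method: the metric names, weights, target values, and the two scenario presets are all hypothetical, chosen only to show how a scenario-dependent utility can re-rank the same radio state.

```python
# Hedged sketch: a scenario-dependent utility for a cognitive engine.
# All metric names, weights, and targets below are invented examples.

def utility(state, weights, targets):
    """Map a metric state to a scalar in [0, 1]; higher = closer to the goal."""
    score = 0.0
    for name, weight in weights.items():
        # Normalize each metric by its (hypothetical) target, capped at 1.0.
        achieved = min(state[name] / targets[name], 1.0)
        score += weight * achieved
    return score

# Two illustrative radio scenarios with different goal emphases.
EMERGENCY = {"link_reliability": 0.7, "throughput_mbps": 0.1, "battery_hours": 0.2}
STREAMING = {"link_reliability": 0.2, "throughput_mbps": 0.7, "battery_hours": 0.1}
TARGETS = {"link_reliability": 0.999, "throughput_mbps": 10.0, "battery_hours": 24.0}

state = {"link_reliability": 0.99, "throughput_mbps": 8.0, "battery_hours": 12.0}
u_emergency = utility(state, EMERGENCY, TARGETS)
u_streaming = utility(state, STREAMING, TARGETS)
```

The same state scores differently under the two weightings, which is the point of letting the CE re-select the utility function when the scenario changes rather than fixing a single static one.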

Performance feedback is collected from other radio nodes or by sensing the environment, which enables the closure of the CE learning loop. The case library, knowledge base, and REM are updated according to the observed performance results [82]. To be flexible to changing radio scenarios, the CE may adopt dynamic situation-aware utility functions rather than relying on a single predefined static function [84].

B. Motivation

Although the notion of CR was introduced in the late 1990s, the wireless community has not converged on a common definition at this time. Researchers and standardization bodies generally agree that a CR should be able to sense the environment and autonomously adapt to changing operating conditions, but they mainly differ on the required levels of situation awareness and cognitive functionality, as illustrated in Table I. In fact, different people and organizations have different expectations of what level of intelligence and which capabilities are essential to, or merely beneficial to, the CR concept. Such diverse expectations make performance evaluation a great challenge in the design of CR devices and networks.

TABLE I: DIFFERENT ASSUMPTIONS ABOUT WHAT CAPABILITIES ARE NECESSARY FOR A RADIO TO BE CALLED A COGNITIVE RADIO. (Definers: FCC [13], Haykin [21], IEEE P1900 [25], IEEE USA [26], ITU-R [29], Mitola [43], NTIA [10], SDRF CRWG [62], VT CRWG [70]. Capability columns: can sense environment; environment "aware"; adaptive transmitter; adaptive receiver; capabilities "aware"; negotiate waveforms; no harmful interference; goal driven; autonomous; learn the environment; adapts (intelligently). The per-definer checkmark entries are not recoverable from this extraction.)

Without well-defined performance metrics and benchmarking methods, it would be almost impossible to put CR technologies into practice. Benchmarking the performance of CRs when coexisting with incumbent radio devices and systems is badly needed by (spectrum) regulators to provide a basis for certifying and regulating CR: regulators need an effective and efficient way to demonstrate that CR devices or systems will not generate harmful interference to incumbent users. CR performance benchmarking is also needed by vendors for type approval testing during the development and production of CRs, and by service providers for CR network deployment and maintenance as well as spectrum trading/subleasing. For CR researchers, formalizing performance metrics and evaluation methodologies will greatly help the community make meaningful comparisons between different CE algorithms and further accelerate the advancement of CR research. Formalizing CR benchmarking methods and performance metrics would also help hasten the integration of the CR paradigm into existing wireless networks.

The dynamic interplay of changing environments, goals, and capabilities (due to learning) means that creating a generic benchmarking method for CR is non-trivial. First, since the performance of a CR may change over time as it learns and adapts to the environment, measurements taken at one time may not be indicative of its performance at a later time. Second, since a CR may support many disparate applications, benchmarking a CR will tend to be a subjective process, as users running different applications in their respective environments will assign different weights to metrics. Third, since most CR designs assume cross-layer adaptations, traditional layered testing under static conditions will yield misleading results. This is not a problem unique to CR, but it does highlight the challenge of objectively evaluating a CR system. Moreover, since a CR network is generally guided by numerous competing objectives, care must be taken in selecting performance metrics: the selection will impact many aspects of CR design and the need to trade off various (possibly conflicting) objectives. Indeed, establishing or selecting effective performance metrics (called "meters" in [61]) is usually one of the most important and challenging steps toward a successful CE design.

C. Scope and Organization

In this paper, we focus on the performance metrics appropriate for dynamic spectrum access and sharing, which have attracted considerable research efforts in the past few years and have significant and immediate impact on commercial, military, and public safety radios [8], [15], [19], [35]. Specifically, this paper reviews candidate CR performance metrics at the node, network, and application levels; examines how CR performance can be tuned in the desired direction by defining and adopting proper metrics and utility functions; and investigates how to efficiently evaluate or validate CR performance under various testing scenarios. We believe that addressing these important issues would be of great interest not only to CR designers and standardization bodies, but also to existing/emerging network operators and spectrum regulators.

The rest of this paper is organized as follows. In Section II, we first define the hierarchical structure and then provide an overview of performance metrics from various perspectives. The inter-relationship between performance metrics, utility functions, goals, decision-making processes, and the achievable CR performance is examined from a game-theoretic view in Section III. In Section IV, we demonstrate CR design trade-offs by examining the interaction among performance metrics, utility functions, and achieved performance, and we present an IEEE 802.22 WRAN Base Station (BS) CE testbed, which provides first-hand experiences and insights into the methodology of CR network evaluation. Concluding remarks, remaining issues, and suggested directions for future CR research are provided in Section V.

Fig. 1. Building blocks of the CR model considered in this paper and the role of performance feedback and utility functions in the CE. (Blocks in the figure: Sensing Receiver(s); Policy- and Goal-Driven Utility Function; Local REM; Situation Awareness; High-level Reasoning and Learning; Radio Transceiver(s); Low-level Fine-Tuning with (Multi-Objective) Optimizer(s); Cooperative Learning with REM Dissemination; Adaptations at Various Layers; REM- or Model-Based Performance Prediction; Global REM at network infrastructure and/or Local REM(s) at other CR nodes; Knowledge Base; Case Library; Performance Evaluation and Feedback.)

II. AN OVERVIEW OF CR PERFORMANCE METRICS

In this section, we review performance metrics that can be used for CR performance evaluation at the node, network, and application levels. In general, some performance metrics are specifically defined for CR, such as channel evacuation time and probability of primary user (PU) detection, as opposed to generic performance metrics that have been widely used in the literature. We focus on such CR-specific metrics in the following.

A. Node-Level Metrics

Generally speaking, a CR node can be evaluated in the following four domains: (i) cognitive functionality, (ii) overall node performance, (iii) complexity, and (iv) technical maturity, as shown in Table II. Each domain may consist of a set of sub-domains or key metrics. For example, cognitive functionality may include the sub-domains of situation awareness, reasoning, learning, decision making, adaptation, and planning. Depending on whose interest is of priority, different metrics are needed: regulators, CR network operators/users, radio equipment vendors, standards organizations, and legacy radio network operators/users may have different concerns about CR performance. In Table II and Table III, we provide extensive lists of candidate performance metrics for a CR node and a CR network, which we nickname the "node score card" and "network score card," respectively.

1) Metrics for Situation Awareness: The performance of a CR network is highly dependent on the extent and quality of the information about the radio environment available at each node. This environmental information (situation awareness) can include awareness of the following: location, RF environment, geographical environment, mobility and trajectory, power supply and energy efficiency, regulation and policy, priority, mission, and context [80]. Each type of information has an associated set of metrics for evaluating its quality or extent, and each of these metrics can be used in the CE to drive the operation of a CR. For example, location accuracy, integrity, availability (in time and space), and continuity are metrics for evaluating the location awareness of a CR node. A CR node may also be aware of the performance of radio devices (both other devices and itself). This includes information such as the linearity of the radio transceiver, the spurious-free dynamic range (SFDR) of the front end, the power and frequency of intermodulation products, noise power, battery life and power consumption, and the cycles and memory required to implement a particular waveform.

For dynamic spectrum access and spectrum sharing applications, the most critical piece of information for secondary users (SUs) is the presence of PUs. Two complementary metrics are used to evaluate a CR's ability to gain meaningful information about the presence of a PU: probability of detection (PD) and probability of false alarm (PFA). In general, a higher PD provides greater protection to PUs but is accompanied by a higher PFA, which tends to lead to less efficient spectrum utilization. Therefore, either a targeted PD or a targeted PFA should be set. The receiver operating characteristic (ROC) of a CR depicts the PU detection rate vs. the false alarm rate under various signal-to-noise ratios (SNRs). It is possible to improve PD without sacrificing PFA by employing cooperative sensing [18].

2) Metrics for Assessing Cognitive Functionality: There are different ways to evaluate the cognitive functionality of a CR, such as reasoning, learning, and planning [63]. A "radio IQ" can be defined with metaphorical levels of cognitive capability: for example, an infant CR may have limited awareness capability, a toddler limited adaptation capability, and a preschooler limited learning capability; an adolescent may avoid making repeated mistakes, while an adult behaves autonomously to reach his/her goals even without inputs from others. Dr. Mitola defines a different functional classification of CR as follows: (i) pre-programmed, (ii) goal driven, (iii) context aware, (iv) radio aware, (v) capable of planning, (vi) conducts negotiations, (vii) learns environment, (viii) adapts plans, and (ix) adapts protocols [43]. It should be noted that this latter classification scheme does not assume a strict progression; e.g., a CR may be able to adapt protocols (level ix) but be unable to negotiate (level vi).

3) Metrics for Assessing Node Performance: Numerous direct or indirect observations could be used by a CE to evaluate the overall performance of a CR node. Many of these metrics have been used in wireless network performance evaluation in the literature; implementation-oriented measures such as signal processing power requirement, memory footprint, and implementation cost, however, are also important for CR performance evaluation [63]. We briefly discuss representative performance metrics at different layers. At the physical layer, commonly used metrics include the signal-to-interference-plus-noise ratio (SINR) or interference-to-noise ratio (INR), bit error rate (BER), frame error rate (FER), bandwidth efficiency, and power efficiency. While these metrics are all derived from physical-layer observations, they can also be useful for decision processes in higher layers. For instance, position and link gain information can help improve the performance of topology formation algorithms, and channel coherence times and link SINRs influence link reliability and thus the selection of the routing algorithm better suited to the level of disruption and mobility. Example link-layer metrics include collision rates, mean channel access times, overhead ratios, and ARQ request rates. Network-layer metrics may include mean and peak packet delay, packet drop rate, and routing table or routing path change rate (for ad hoc and sensor networks).

TABLE II: EXAMPLE NODE-LEVEL PERFORMANCE METRICS (OR "NODE SCORE CARD") FOR SPECTRUM-SHARING NETWORKS. (Domains and example metrics, as recoverable from the extraction: Situation Awareness — location accuracy, integrity, availability, and continuity in various outdoor/indoor environments [11]; RF environment awareness, including required SNR for PU detection at a given detection rate and false alarm rate [18], ROC (PU detection rate vs. false alarm rate under various SNRs and a given time-bandwidth product [53]), (total) spectrum sensing time for PU detection at a given sensitivity [72], signal classification/recognition functionality [14], spectrum opportunity tracking and prediction [78], and radio channel condition awareness such as multipath delay spread and Doppler spread [57]; mobility and trajectory awareness [80]; power supply and energy efficiency awareness [80]; regulation, policy, mission, context, and priority awareness [62]. Adaptation Capability — channel evacuation time when a PU (re)appears [41]; cross-layer adaptability; operational channels/bands and switch time between them [41]; antenna pattern adaptability [23]; dynamic range at receiver and transmitter; waveform/air interface flexibility and reconfigurability; routing protocol adaptability. Reasoning, Decision Making, and Learning — case/knowledge/policy-based reasoning capability and case retrieval time [83]; learning capability, including flexibility of learning (types of learning methods supported) [16] and effectiveness of learning (performance vs. training time or learning period) [76]; decision-making capability, including distributed or centralized decision making [79], decision-making algorithm convergence time [83], [84], and policy conformance enforcement [54]; overall radio IQ level: "infant," "toddler," "preschool," "child," "adolescent," "teenager," "young adult" [63]. Overall Node Performance — SINR or INR [41]; spectrum utilization (e.g., sum throughput, network available time); impact on other SU nodes or incumbent radios, e.g., transfer (the net utility loss of the other nodes caused by one CR node) [12]; power efficiency (active time, battery life); communication cost for the end user; link reliability (BER, FER, or packet drop ratio) [66]. Node Complexity — signal processing power requirement, memory footprint, implementation costs, etc. Technical Maturity — maturity of key technologies, e.g., Software Defined Radio [59], multi-band RF transceivers and antennas, analog-to-digital converters [36], [51], [52], and artificial intelligence [15], [16]; overall CR node technology maturity.)
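The PD/PFA trade-off described above can be made concrete with a small simulation. The sketch below estimates one ROC operating point for a simple energy detector by Monte Carlo; the sample count, SNR, and threshold are illustrative assumptions, not values from this paper.

```python
import math
import random

def energy_detect(num_trials, num_samples, snr_linear, threshold):
    """Estimate (PD, PFA) for an energy detector at a fixed threshold.

    Noise is unit-variance Gaussian; under H1 a Gaussian 'signal' with
    power snr_linear is added. The test statistic is the average energy
    over num_samples samples. All parameter values are illustrative.
    """
    detections = false_alarms = 0
    for _ in range(num_trials):
        # H1: PU signal present.
        e1 = sum((random.gauss(0, 1) + random.gauss(0, math.sqrt(snr_linear))) ** 2
                 for _ in range(num_samples)) / num_samples
        # H0: noise only.
        e0 = sum(random.gauss(0, 1) ** 2 for _ in range(num_samples)) / num_samples
        detections += e1 > threshold
        false_alarms += e0 > threshold
    return detections / num_trials, false_alarms / num_trials

random.seed(1)
pd, pfa = energy_detect(num_trials=2000, num_samples=50, snr_linear=0.5, threshold=1.2)
# Raising the threshold lowers PFA but also lowers PD, i.e., it moves the
# operating point along the ROC curve, as discussed in the text.
```

Sweeping the threshold (or the SNR) and recording (pfa, pd) pairs traces out the ROC curve; cooperative sensing would combine several such statistics before thresholding.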

B. Network-Level Metrics

Similarly, we may evaluate a CR network in the following four domains: (i) cognitive functionality, (ii) overall network performance, (iii) complexity, and (iv) technical maturity, as shown in Table III. Some metrics, e.g., spectrum utilization efficiency [27], [31], [32], have definitions similar to the corresponding node-level metrics but are evaluated over the entire network. We focus here on the metrics for spectrum utilization efficiency and those used by the DARPA XG program.

TABLE III: EXAMPLE NETWORK-LEVEL PERFORMANCE METRICS (OR "NETWORK SCORE CARD") FOR SPECTRUM-SHARING NETWORKS. (Domains and example metrics, as recoverable from the extraction: Situation Awareness — PU/SU awareness: PU/SU detection rate and false alarm rate [11]; policy awareness [80]; awareness of the adaptation capability of each node in the network; awareness of network topology, routing protocol, network status, and the current goal. Adaptation Capability — routing protocol and topology adaptability; cross-layer adaptation capability; overall network IQ level. Reasoning, Decision Making, and Learning — decision-fusion overhead [81] and decision-fusion time; policy conformance enforcement [54]. Overall Network Performance — spectrum utilization, in terms of spectrum utilization efficiency [27], [31], [32], "packing factor" (or "idle percentage") in the frequency, space, and time (even code) domains, rendezvous time (network access time) [41], and network availability [41]; sum throughput or goodput [9], [76]; throughput of the secondary system [17], [78]; impact on PU networks or other co-existing SU networks [58], e.g., SINR, INR, or BER degradation at the PU receiver [86], increased average packet delay experienced by the incumbent PUs [86], and increases in call drop rate, origination failure, handover failure, and termination failure [66]; end-to-end metrics: average throughput, delay, and jitter [9]; networking metrics: MANET scalability [56] and network reliability, scalability, and mobility support [56]; network security: robustness to malicious nodes [6] and vulnerability to denial-of-service attacks [5]. Application QoS — voice quality, e.g., mean opinion score (MOS) [28]; video quality, e.g., media delivery index (MDI) [73] and peak signal-to-noise ratio (PSNR); response time for interactive data applications (e.g., Telnet or World Wide Web); throughput for bulk data applications (e.g., FTP or distributed database synchronization); call setup time or call blocking rate. Network Complexity — signal processing power requirement, memory footprint, implementation costs, etc. Technical Maturity — maturity of key technologies for CR networks; overall CR core network technology maturity.)

1) Metrics for Spectrum Utilization Efficiency: The ITU-R Handbook on spectrum management presents two different methods for calculating spectrum utilization efficiency (SUE) [27]. In the first method, spectrum utilization is determined by the amount of frequency, geometric space, and time used, and may be calculated using

U = B × S × T,    (1)

where U is the amount of spectrum space used (Hz × m³ × s), B the spectrum bandwidth, S the geometric space, and T the time. SUE is then computed as the ratio of the information transferred (denoted by M) to the amount of spectrum utilized:

SUE = M/U = M/(B × S × T).    (2)

The second ITU-R method is based on a special procedure for re-designing the frequencies of operating radio stations and uses

U = ∆F/∆F0    (3)

to calculate the spectrum utilization, where ∆F is the minimal necessary frequency band to permit the functioning of the operational facilities of interest, and ∆F0 is the frequency band being analyzed. The lower bound for U is achieved by determining the ∆F of the optimum or near-optimum frequency use algorithm. We can then compute SUE using (2).

2) Metrics Adopted by the DARPA XG Program: The U.S. DARPA XG program used a different set of performance metrics during its 2006 field tests. These performance metrics were defined for the following three scenarios [41]:
• The XG network causes no harmful interference to non-XG systems, in terms of abandon time (i.e., abandon a frequency channel within 500 ms) or interference limitation (i.e., maintain less than 3 dB SNR degradation at a protected receiver).
• The XG network forms and maintains dynamic connectivity, in terms of network formation/rendezvous time (establish an XG network of 6 nodes within 30 seconds), network join time (join a node to an existing XG network within 5 seconds), and channel re-establishment time (re-establish an XG network of 6 nodes within 500 ms).
• The XG network adds value, in terms of reducing spectrum management setup time (no pre-assigned frequencies increase deployment flexibility) and increasing spectrum access (communications capacity), e.g., 60% or more spectrum occupancy with a 6-node XG network.

Note that the above metrics defined for the XG program are used as thresholds for establishing early confidence in the viability of dynamic spectrum access technologies [41].
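As a numerical illustration of (1) and (2), the short sketch below computes U and SUE for hypothetical bandwidth, coverage-volume, time, and traffic figures; all numbers are invented for illustration and carry no significance beyond exercising the formulas.

```python
# Hedged sketch of the ITU-R style spectrum utilization (1) and SUE (2).
# All numeric values are hypothetical; units follow the text (Hz, m^3, s, bits).

def spectrum_space_used(bandwidth_hz, space_m3, time_s):
    """Eq. (1): U = B x S x T."""
    return bandwidth_hz * space_m3 * time_s

def sue(bits_transferred, bandwidth_hz, space_m3, time_s):
    """Eq. (2): SUE = M / (B x S x T)."""
    return bits_transferred / spectrum_space_used(bandwidth_hz, space_m3, time_s)

# A 6 MHz channel over a nominal 1 km^3 volume for one hour,
# carrying a hypothetical 1 Tbit of traffic.
U = spectrum_space_used(bandwidth_hz=6e6, space_m3=1e9, time_s=3600.0)
efficiency = sue(bits_transferred=1e12, bandwidth_hz=6e6, space_m3=1e9, time_s=3600.0)
```

Comparing `efficiency` across competing access schemes under the same B, S, and T is what makes (2) usable as a network-level score.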

C. Application Performance Metrics

Although listed in Table III, application performance metrics are quite different from generic network-level metrics: they effectively unify the influences of the entire protocol stack. It is a great challenge to define proper performance metrics for general applications, which are highly diverse. From this perspective, application metrics are similar to the utility functions that will be discussed in Section III; the difference is that utility functions are based on the abstract (and loose) concept of utility, while application metrics aim to model the perception of human users. We take video, a spectrum-hungry application, as an example in this section and review performance metrics used in the literature and in practice for evaluating the quality of video delivered through a network. Video quality is usually a very complex function of network-layer statistics. Various video performance metrics can be roughly categorized into two classes, subjective and objective video quality metrics, which are reviewed in the following.

1) Subjective Performance Metrics: In subjective video quality evaluation, a sufficiently large number of experts view a chosen video sequence (alone or contrasted with the original video clip), and their opinions are collected and analyzed [55]. This approach has been adopted by the ITU-R in ITU-R BT.500-11 [30]. Subjective video quality represents the ultimate user-perceived performance, not only because it is the ultimate performance measure, but also because it unifies the impacts of most of the lower-layer performance metrics. However, such an approach is expensive and difficult to carry out and repeat. The test results also depend heavily on the expertise and preferences of the viewers, as well as many other factors such as room illumination and display type.

2) Objective Performance Metrics: Objective video quality measures are obtained by directly analyzing the received video flow or the reconstructed video. Such evaluation is easier to carry out than subjective approaches, and the results are easily repeatable. Many objective video quality metrics have been proposed and adopted in the past (see [44], [65] and references therein). Such metrics can be roughly classified according to how they are computed: (i) from the received video packet flow, (ii) from reconstructed video frames, and (iii) from theoretical rate-distortion models. We review representative metrics in each category in the following.

a) Media Delivery Index (MDI): MDI is a metric computed from the received video packet flow, as defined in RFC 4445 [73]. It is designed as a quality indicator for monitoring networks intended to deliver applications such as streaming media, MPEG video, Voice over IP, or other information sensitive to arrival time and packet loss. It consists of a two-tuple separated by a colon: a Delay Factor (DF) and a Media Loss Rate (MLR). The two-tuple provides an indication of traffic jitter, a measure of deviation from normal flow rates, and an at-a-glance data loss measure for a particular flow [73]. Computed from network measurements, MDI is independent of the video codec and video sequence, and the measurements are easy to carry out with low complexity [73]. Since the MDI is measured at the receiver, there is no need for the original video, making it useful for live video applications. It has been implemented in commercial devices (e.g., Agilent's N2X platform [1]). However, MDI should be used with caution, since it has been shown that the loss pattern has a significant effect on video distortion [37]: for two flows with identical MLRs, the subjective video quality may still be very different. MDI should therefore be used only as an indicator for inferring media quality.

b) Peak Signal-to-Noise Ratio (PSNR): PSNR is a mean square error (MSE) based metric that measures quality by simple pixel-to-pixel comparisons of the reconstructed video with the original video. For a video sequence of K frames, each having N × M pixels with L-bit depth, PSNR is computed as

MSE = (1/(N × M × K)) Σ_{i,j,k} [x(i,j,k) − x̄(i,j,k)]²,
PSNR = 10 × log10((2^L − 1)² / MSE),    (4)

where x(i,j,k) and x̄(i,j,k) are the pixel luminance values at the (i,j) location in the k-th frame of the original and reconstructed videos, respectively. PSNR is easy to compute and well understood by many researchers. However, the above computation requires the original video, making PSNR unsuited for live video applications. It has also been found that PSNR sometimes does not conform well to subjective video quality.

c) Rate-Distortion Model Approach: If we assume that the source statistics are Laplacian distributed and the distortion measure is D(x, x̄) = |x − x̄|, then there is a closed-form expression for the rate-distortion function, R(D) = ln(1/(αD)) [71]. Functions with simpler forms can be used to approximate this rate-distortion function [7]. For streaming video, the overall distortion of a reconstructed video, Dd, can be decomposed into two parts, the distortion caused by signal compression, De, and the distortion caused by transmission errors, Dv, as [67]

Dd = De + Dv = [D0 + ω/(R − R0)] + κ × p,    (5)

where D0, R0, ω, and κ are coefficients to be determined for a specific video codec and video sequence, R is the encoding rate, and p is the packet loss probability. The coefficients cannot be directly derived from commonly used signal statistics. It is also worth noting that the average video packet loss rate p is used in (5), even though the loss pattern has a significant effect on video distortion [37]. This rate-distortion function approach provides closed-form expressions that translate network delivery metrics into video distortion. It is therefore very useful for theoretical studies of video streaming systems [33].
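The PSNR computation in (4) is small enough to state in code. The sketch below evaluates it for a tiny synthetic 8-bit "frame"; the sample values are invented, and frames are passed as flat lists of luminance samples purely for brevity.

```python
import math

def psnr(original, reconstructed, bit_depth=8):
    """PSNR per Eq. (4): 10*log10(peak^2 / MSE) with peak = 2^bit_depth - 1.

    'original' and 'reconstructed' are equal-length flat lists of
    luminance samples (all frames concatenated).
    """
    n = len(original)
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / n
    if mse == 0:
        return float("inf")  # identical videos
    peak = (1 << bit_depth) - 1
    return 10.0 * math.log10(peak * peak / mse)

# Tiny hypothetical 2x2 'frame' whose mean squared error is exactly 4.
orig = [100, 120, 140, 160]
recon = [102, 118, 142, 158]
value = psnr(orig, recon)  # about 42.1 dB for 8-bit video
```

Note that both sequences must be available, which mirrors the text's point that PSNR is unsuited to live monitoring where the original is absent at the receiver.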

It is also worth noting that in such models only the average video packet loss rate (p) is used; the impact of different loss scenarios is not modeled, although it has been shown that the loss pattern has a significant effect on video distortion [37]. Moreover, the coefficients cannot be directly derived from commonly used signal statistics, but need to be estimated by fitting the model to a subset of measured data points from the distortion-rate curve [67]. Similar considerations apply to other quality metrics of video streaming systems [33].

III. FROM METRICS TO UTILITY FUNCTIONS – A GAME-THEORETIC PERSPECTIVE

As shown in Fig. 1, the utility function and the goal are important components of a CE. Generally speaking, utility functions are based on the abstract (and loose) concept of utility: utility is an assignment of values (numbers) to the current operating state such that the closer the CR comes to satisfying some goal, the greater the value assigned to the operating state. Utility functions can incorporate a large number of performance metrics and are usually dynamic and application-specific. For some CR applications or scenarios, the goal is characterized or dominated by a single objective; in these situations, the utility functions used by a CE could be very simple and defined by some basic function of the goal, such as a monotonic function, a non-monotonic (convex or concave) function, or a logistic function (i.e., a sigmoid such as the hyperbolic tangent, the arc-tangent, or the error function). When the goal is characterized by multiple or competing objectives, the utility function could be more complicated. Section IV presents more details about the features of some utility functions employed by a prototype CE. As we show in the following, how nodes choose their utility functions can significantly impact network behavior, and to further complicate matters, how utility functions impact network behavior can vary from situation to situation.

A. Challenging Issues for CR Design

In the following, we first discuss the interdependent nature of actions, goals, decisions, observations, and context. We then present insights into these interdependencies from game models of CR networks.

1) Network Performance Considerations: If we ignore learning processes, which are generally not assumed to be running while a network is active [43], the behavior of a CR network is influenced by the following factors:
• Observations – the measurements or metrics, e.g., power spectral densities (PSDs), collision frequency, or latency, by which CRs gain awareness of their operational environment. Observation processes could reside on a single device or be formed through the collective behavior of many devices.
• Available actions – the various adaptations, e.g., power, frequency, backoff timers, or multiple access techniques, to which the radio is constrained by policy, capability, and/or operational requirements.
• Decision processes – the algorithms which map observations to adaptations, e.g., genetic algorithms or local searches. The decision process could reside on a single radio or conceptually span a cluster or the entire network.
• Goals of the radios – the objective(s) which guide the decision processes, e.g., maximize SINR, minimize latency, minimize device power consumption, or maintain a connected network.
• Operational context – the conditions in which the CR network operates, e.g., position, propagation environment, mission, radio capabilities, or network topology.

As illustrated in Table IV, varying any one of these parameters, i.e., the observations (O), actions (A), decision process (D), goals (G), or context (C), can lead to radically different outcomes even when the remaining parameters are held constant. Note that in a multi-layer CR, the adaptations of one layer could alter the operational context of the processes controlling other layers; for example, higher-layer topology choices will dictate which link gains most influence the adaptations of the PHY. Thus, designing cross-layer CR networks that operate under varying policy constraints, contexts, and goals while achieving desired behavior is a nontrivial task. This wide variation in outcomes could be mitigated by an omniscient centralized controller, but in practice all observations could be imperfect, and a completely centralized solution may not offer the requisite level of responsiveness when scaled to large networks. The issue of scaling indicates that some degree of decision distribution will be necessary, even if some adaptations are capable of being performed in a centralized (or collaborative) manner, perhaps in individual clusters.

2) Modeling Networked Behavior: Because of the interactive nature of CR networks, game theory is an important tool for system modeling and analysis. A single iteration of adaptations by a network of CRs can be modeled as a normal form game, Γ = <N, A, {ui}>, where N denotes the set of players (CRs) of cardinality n and i ∈ N specifies a particular player; A represents the adaptation space formed as A = A1 × · · · × An, where Ai specifies the action set of player i; and {ui} is the set of utility functions, ui : A → R, i.e., the assignment of a real number to every possible combination of choices of actions by the radios, describing the values which the radios assign to points in A. To give an intuitive feel for what we are modeling in a traditional game model of a CR network [45]: each CR represents a player, the adaptations available to each CR form the action set of its associated player, and a quantification of each CR's goal supplies the utility function for the associated player. Conceptually, this can be considered to subsume the models used in reasoning processes, such as models of the environment.

This basic model can be extended by considering the specific decision rules di : A → Ai that guide the radios' adaptations and the decision timings T at which the decisions are implemented, forming the extended modeling tuple <N, A, {ui}, {di}, T> [45]. Here the term "decision rule" refers to some well-defined process that controls a CR's adaptations. For notational convenience, we use a to denote an action vector wherein each player in the game has chosen an action, ai to refer to the action chosen by player i, and a−i to refer to the vector formed by considering all actions other than the action chosen by i. With this model it is sometimes convenient to use d(a) to refer to the collective application of di(a) at the times specified by T.
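To make the modeling tuple concrete, consider a toy two-radio instantiation in Python (a purely illustrative sketch, not one of the games studied in this paper): each radio picks one of two channels, earns utility 1 when the channels differ, and applies a best-reply decision rule in sequence.

```python
# Toy <N, A, {u_i}, {d_i}, T> instantiation: two radios, two channels.
N = [0, 1]
A_i = [0, 1]                                  # per-player action set A_i

def u(i, a):                                  # u_i : A -> R (1 if no co-channel interference)
    return 1 if a[0] != a[1] else 0

def d(i, a):                                  # d_i : A -> A_i, a best-reply decision rule
    return max(A_i, key=lambda b: u(i, a[:i] + [b] + a[i + 1:]))

a = [0, 0]                                    # both radios start on channel 0
a = [d(0, a), a[1]]                           # decision timing T: radio 0 adapts first,
a = [a[0], d(1, a)]                           # then radio 1 (adaptations not synchronized)
print(a)                                      # prints [1, 0]: the radios separate
```

Note that the sequential timing matters: if both radios applied d simultaneously from [0, 0], they would both jump to channel 1 and remain in conflict, which previews the synchronization caveat discussed below.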

Even though an exact characterization of a network's behavior depends on all the parameters in <N, A, {ui}, {di}, T>, we can gain powerful insights into network behavior by examining just the submodel Γ = <N, A, {ui}>, which neglects the contribution of the decision rules and decision timings. For instance, if we assume that each radio's decision rule has been designed to increase the value of ui with each adaptation, then we know that Nash Equilibria (NE), i.e., action vectors a* such that ui(a*) ≥ ui(bi, a*−i) for all i ∈ N and all bi ∈ Ai, will be steady-states for the network [45].

However, some CRs are not implemented with well-defined decision rules and are instead only lightly governed by goals. For example, a decision rule may specify discrete steps up or down in transmission power in response to observed channel conditions, or may specify a sequence of alternate frequencies to try when interference is detected. To handle both of these cases, we restrict our design framework to a set of decision rules, which we term autonomously rational, that satisfy (6):

  bi ∈ di(a), bi ≠ ai ⇒ ui(bi, a−i) > ui(ai, a−i), ∀i ∈ N, ∀a ∈ A.   (6)

A game theorist would refer to the behavior which results from the use of decision rules of this form as a better response dynamic. Similarly, an exhaustive better response dynamic occurs when all decision rules satisfy (7):

  ai ∉ di(a) if ∃ bi ∈ Ai : ui(bi, a−i) > ui(ai, a−i), ∀i ∈ N, ∀a ∈ A.   (7)

Interestingly, proper selection of the radios' utility functions can be used to guarantee network convergence and stability under very broad assumptions about the radios' decision rules. Suppose we know that Γ constitutes an exact potential game, i.e., a game for which there exists a potential function V : A → R such that all utility functions satisfy (8):

  ui(bi, a−i) − ui(ai, a−i) = V(bi, a−i) − V(ai, a−i), ∀i ∈ N, ∀a ∈ A, ∀bi ∈ Ai.   (8)

Then we know the network will converge as long as the decision rules are autonomously rational and care is taken to ensure that adaptations are not synchronized (lest radios continuously repeat conflicting decisions). Note that establishing this broad convergence of decision rules is a property unique to potential games [45]; with other normal form game models, only more specific decision rules can be shown to converge. Likewise, if we know that the decision rules are exhaustive better responses, then the NE will be the only steady-states, and isolated steady-states will be stable since −V constitutes a Lyapunov function [45]. Similarly, when a CR network can be modeled as a supermodular game [45], decision rules which locally optimize each adaptation (known as best responses in game theory parlance) will converge, although suboptimal (though autonomously rational) adaptations can become trapped in loops in a supermodular game [45].

TABLE IV
EVEN WITH ALL OTHER PARAMETERS HELD CONSTANT, VARYING THE OBSERVATIONS (O), ACTIONS (A), DECISION PROCESSES (D), GOALS (G), OR CONTEXT (C) CAN LEAD TO RADICALLY DIFFERENT OUTCOMES

Varying observations:
  Scenario 1 (O: interference at access point from other access points; A: frequency (channel); D: lowest interference channel; G: minimize interference; C: tent city). Result: converges to a near-optimal frequency reuse pattern [46].
  Scenario 2 (O: interference seen by clients; other parameters unchanged). Result: enters an infinite loop with probability 1 as the network scales [49].

Varying actions:
  Scenario 1 (O: SINR at cluster head; A: frequency; D: maximize goal; G: maximize SINR; C: isolated cluster). Result: network tends to converge to low interference states.
  Scenario 2 (A: power; other parameters unchanged). Result: network tends to converge to self-jamming states.

Varying decision processes:
  Scenario 1 (O: collisions; A: transmission times; D: collaborate on times; G: minimize collisions; C: isolated cluster). Result: rapid convergence to a minimal interference state, adjustable to different user priorities.
  Scenario 2 (D: noncollaboratively choose times; other parameters unchanged). Result: slow (if at all) convergence; throughput as low as ALOHA (1/e).

Varying goals:
  Scenario 1 (O: SINR at cluster head; A: power; D: maximize goal; G: target SINR; C: isolated cluster). Result: if the target SINR is feasible, converges to the target SINR [75].
  Scenario 2 (G: maximize SINR; other parameters unchanged). Result: network tends to converge to self-jamming states.

Varying context:
  Scenario 1 (O: SINR at cluster head; A: power; D: punish (jam) radios deviating from target SINR; G: target SINR; C: isolated cluster). Result: network overcomes defection problems for a significant improvement in performance [40].
  Scenario 2 (C: isolated cluster with a jammer; other parameters unchanged). Result: network self-jams as it "punishes" the jammer.
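To illustrate the convergence guarantee behind (8), the following sketch runs an unsynchronized better response dynamic on a small hypothetical channel-selection game (an illustrative topology, not one of the scenarios in Table IV). Each radio's utility is the negative number of neighbors sharing its channel, for which V(a), the negative number of conflicting neighbor pairs, is an exact potential; every better response strictly increases V, so the dynamic must terminate at a Nash equilibrium.

```python
n, channels = 4, [0, 1, 2]
neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}   # symmetric interference graph

def u(i, a):                 # u_i(a): minus the number of neighbors on i's channel
    return -sum(1 for j in neighbors[i] if a[j] == a[i])

def V(a):                    # exact potential: minus the number of conflicting pairs
    return -sum(1 for i in neighbors for j in neighbors[i] if j > i and a[i] == a[j])

a = [0, 0, 0, 0]             # all radios start on the same channel
improved = True
while improved:              # unsynchronized round-robin adaptations
    improved = False
    for i in range(n):
        best = max(channels, key=lambda c: u(i, a[:i] + [c] + a[i + 1:]))
        if u(i, a[:i] + [best] + a[i + 1:]) > u(i, a):   # better response, Eq. (6)
            v_old = V(a)
            a[i] = best
            assert V(a) > v_old                          # each step strictly increases V
            improved = True
# at exit no radio can unilaterally improve, so a is a Nash equilibrium
```

Because V is bounded above and strictly increases with every adaptation, the loop cannot cycle; this is exactly the argument that makes convergence a structural property of the utility functions rather than of any particular decision rule.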

More formally, for continuous, twice differentiable utility functions, the existence of a potential function (and thus the existence of a potential game) can be established by demonstrating that (9) holds:

  ∂²ui(a)/∂ai∂aj = ∂²uj(a)/∂ai∂aj, ∀a ∈ A, ∀i, j ∈ N.   (9)

A similar relationship also holds for supermodular games with action sets that are compact convex subsets of the real number line R, as shown in (10):

  ∂²ui(a)/∂ai∂aj ≥ 0, ∀a ∈ A, ∀i, j ∈ N.   (10)

More generally, determining if a network can be modeled as a potential game requires an examination of the interrelationships between the radios' utility functions; essentially, for a network to be modeled as a potential game, a certain symmetry between the radios' utility functions must exist. When the radios' utility functions take on one of the forms shown in Table V, the game is known to be a potential game; thus, if we choose utility functions for CRs that take one of those forms, we will be broadly assured that the network will be convergent and stable. Likewise, when the goals of the radios are combined with noninteractive performance metrics, where performance is only a function of the radio's or cluster's own adaptations, the convergence and stability properties of the network are frequently not affected (though the network steady-states may be significantly impacted). Other combinations, however, can have significant effects on the behavior of the network.

One downside to this approach is that a potential game is not guaranteed to converge to an optimal point; indeed, it would be odd if it did, as what is "optimal" is quite subjective. Thus the assurance of convergence and stability is not accompanied by an assurance of optimality, and when adopting this approach care must still be given to the design of the observation processes, which can depend on the choice of objective functions if we do not assume the existence of a common knowledge database such as the Radio Environment Map (REM) [80].

B. Utility Function Selection

Clearly, useful insights about the behavior of a CR network can be gleaned by examining its associated game model, particularly when the CR network can be shown to be a potential or supermodular game. Unfortunately, assigning the radios a utility function which seeks to directly optimize a network-wide performance metric requires that the radios be capable of observing network-wide performance; such an approach would thus necessitate the additional design of some mechanism for providing this network-wide awareness. In a distributed system, however, some design objectives have an inherent structure such that utility functions can be designed which require only locally available observations. For instance, in [46] we first showed that a design objective of minimizing the sum of observed network interference levels in an 802.11 network performing dynamic frequency selection exhibits a natural pairwise symmetry in its interference terms. This permitted us to design an algorithm, presented in [47], which employs utility functions that effectively correspond to a Bilateral Symmetric Interaction (BSI) game: conceptually a network-wide interference minimization, but in practice just the radios minimizing their own observed interference levels. We have since applied similar techniques to permit the adaptation of power control, device power consumption by waveform, and deployment in an ad-hoc network [48].

Alternately, given an arbitrary design objective O(a), a potential game could be created by virtue of a Coordination Game wherein all radios choose their actions to maximize O(a), although beyond the serendipitous discovery of a function that satisfies (8), this again presumes network-wide observations. In general, when working with a particular design objective in mind and when the broad convergence properties of a potential game are desired, the relationship between the design objective and the potential functions in the third column of Table V should be examined first, and one should then work backwards to the utility functions and implied observations of the second column. If we start from the premise that our utility functions will satisfy the conditions of a potential game, then we can significantly relax the constraints on our decision rules.

C. Interaction Between Metrics, Utility, and Learning

A number of CR design problems are characterized by several non-commensurable and often competing measures of performance; in these cases, the CR decision-making process is a multi-objective or constrained optimization problem [38]. Several techniques have been proposed for combining goals, including evaluating Pareto dominance, weighted sums of goals, products of normalized goals, or more arbitrary nonlinear transformations [61], [68]. In general, there is no generalizable technique for combining multiple goals, as the ideal combination of goals will be heavily context and mission dependent; consider, for instance, prioritized transmissions. So, in general, the actions, observations, decision rules, utility functions, goals, and operating context are all highly interdependent when examining the performance of a CR network.

The performance of a CR network can be improved if we permit the network to improve its performance post-deployment by refining its parameters (models, utility functions, decision rules, observations, or objectives). Learning can proceed by a variety of algorithms, including case-based and knowledge-based learning [83], behavioral learning (e.g., training a neural network or an equalizer), and logic-based learning (e.g., induction or deduction). Each learning method has its relative strengths and limitations, and synergistic combinations of these algorithms are expected to yield better results [83]; we refer interested readers to [16] for more details about these learning techniques. Models can be refined for better predictions of performance, and thus better decisions, and new contexts can be recognized and optimal adaptations learned. In general, learning to improve the accuracy and efficiency of the observation and classification processes will not negatively impact system robustness.
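The symmetry conditions (8)-(9) can be checked numerically. The following sketch uses hypothetical quadratic BSI-style power-control utilities with a pairwise-symmetric coupling matrix (an illustrative construction, not the algorithms of [46]-[48]), and verifies that every unilateral deviation changes ui and the candidate potential V by exactly the same amount, as (8) requires.

```python
import random

random.seed(0)
n = 3
g = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        g[i][j] = g[j][i] = random.random()   # pairwise-symmetric interference coupling

def u(i, a):                                  # BSI-style utility u_i(a)
    return -sum(g[i][j] * a[i] * a[j] for j in range(n) if j != i)

def V(a):                                     # candidate exact potential function
    return -sum(g[i][j] * a[i] * a[j] for i in range(n) for j in range(i + 1, n))

a = [random.random() for _ in range(n)]       # an arbitrary action (power) vector
for i in range(n):
    b = a[:]
    b[i] = random.random()                    # unilateral deviation by player i
    assert abs((u(i, b) - u(i, a)) - (V(b) - V(a))) < 1e-12   # Eq. (8) holds
```

If the coupling matrix were not symmetric (g[i][j] ≠ g[j][i]), the assertion would fail for some deviation, mirroring the failure of condition (9).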

Reducing the bias or variance of existing observations, perhaps by better matching models to context (e.g., moving from a Rayleigh to a Ricean model when a LOS component is present) or by upgrading to more accurate algorithms (e.g., using a cyclostationarity approach instead of a simple windowed PSD for signal detection/classification), will generally lead to a system that more closely matches expected performance, since the observation variance has decreased. However, when learning spawns new processes, it is difficult to guarantee continued robustness of the system. This will largely not be a problem for centralized or collaborative systems, where we can conceptually integrate any combination of observations, decisions, actions, goals, and contexts, but such systems might be limited in scalability, require additional overhead, and have longer response times.

TABLE V
COMMON EXACT POTENTIAL GAME FORMS

  Coordination Game: ui(a) = C(a); V(a) = C(a).
  Dummy Game: ui(a) = Di(a−i); V(a) = c, c ∈ R.
  Coordination-Dummy Game: ui(a) = C(a) + Di(a−i); V(a) = C(a).
  Self-Motivated Game: ui(a) = Si(ai); V(a) = Σ_{i∈N} Si(ai).
  Bilateral Symmetric Interaction (BSI) Game: ui(a) = Σ_{j∈N\{i}} ωij(ai, aj) − Si(ai), where ωij(ai, aj) = ωji(aj, ai); V(a) = Σ_{i∈N} Σ_{j=1}^{i−1} ωij(ai, aj) − Σ_{i∈N} Si(ai).
  Multilateral Symmetric Interaction (MSI) Game: ui(a) = Σ_{S∈2^N: i∈S} ωS,i(aS) + Di(a−i), where ωS,i(aS) = ωS,j(aS), ∀i, j ∈ S; V(a) = Σ_{S∈2^N} ωS(aS).

IV. APPLICATIONS OF PERFORMANCE METRICS AND UTILITY FUNCTIONS IN AN 802.22 WRAN CE TESTBED

Having discussed the choices and interdependencies of performance metrics and utility functions, a remaining question is how to incorporate them in CR performance evaluations. In this section, we present the selection of performance metrics and utility functions, and the performance evaluation methodology, for a CE testbed developed at Wireless@Virginia Tech [83], [84]. The CE can be employed by an IEEE 802.22 WRAN Base Station (BS) for real-time radio resource management of secondary users in currently underutilized TV spectrum.

A. Metrics Selection

In IEEE 802.22 WRAN, the services and QoS requirements are quite similar to those in 802.16 WiMAX [24]. The following performance metrics were considered for the CE testbed development:
1) u1 = QoS satisfaction of all connections, in terms of the average utility of all downlink and uplink connections between the BS and the customer premise equipment (CPE).
2) u2 = spectral efficiency, in terms of the number of available candidate channels after allocating radio resources to a given number of connections. This metric is more important for multi-cell scenarios or for a single cell serving a large number of CPEs.
3) u3 = power efficiency, in terms of the transmit power of individual CPEs. This metric is more important for mobile or portable user devices or for overlapping WRANs.
4) u4 = adaptation time when the CE is exposed to a new scenario, such as when PUs (re)appear on an active WRAN channel. Fast adaptation is critical for time-sensitive WRAN applications.

To obtain a convenient measure of the available radio resource at a BS and the requested radio resource from the CPEs, a unitless metric, RRU, was proposed. RRU is an abstraction from physical layer details (such as modulation/coding schemes and channel bandwidth), thus making the developed CE algorithms generic. RRUreq indicates the number of OFDM sub-carriers that need to be allocated. For the OFDMA/TDD-based WRAN, the required RRU for setting up a connection between the BS and a CPE can be estimated by

  RRUreq = (1 + α) R / (β × BWsc),   (11)

where α is the overhead factor (unitless) that takes the overhead of the WRAN protocol into consideration and can be determined from the WRAN system specifications; R is the data rate of the new connection (in units of "b/s"), determined by the service type; β is the spectral efficiency (in units of "b/s/Hz"), jointly determined by the highest applicable modulation level and channel coding rate; and BWsc is the bandwidth of a WRAN OFDM sub-carrier (in units of "Hz"), given by

  BWsc = (TV Channel Bandwidth) / (FFT Mode).   (12)

B. Utility Functions Selection

The global utility function for the WRAN CE testbed is defined by

  uglobal = Π_i (ui)^ωi,   (13)

where ωi is the weight applied to the i-th performance metric (ui). Different weight vectors could be applied to adjust the utility function. Similar to the geometric mean, (13) accentuates low utility metrics, thus providing a fair and balanced combination of the various performance metrics. For the WRAN BS CE testbed, the global utility (uglobal) is subdivided between the individual CPE utilities (ucpe) and the normalized spectral efficiency of the BS (uBS) as

  uglobal = ( Π_{i=1}^{N} ucpe,i )^(ω1/N) × (uBS)^ω2,   (14)

where N is the total number of active CPEs connected with the BS, and ω1 and ω2 are the weights applied to the geometric mean of the individual CPE utilities and to the spectral efficiency of the BS, respectively.
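As a numeric sketch of (11)-(14), the following uses the Table VI values α = 0.1, 6 MHz channel bandwidth, and a 2,048-point FFT; the spectral efficiency β = 1 b/s/Hz and the weights are illustrative assumptions, not the testbed's settings:

```python
def rru_req(rate_bps, alpha=0.1, beta=1.0, tv_bw_hz=6e6, fft_mode=2048):
    bw_sc = tv_bw_hz / fft_mode                      # Eq. (12): ~2.93 kHz per sub-carrier
    return (1 + alpha) * rate_bps / (beta * bw_sc)   # Eq. (11)

def u_global(u_cpe, u_bs, w1=0.9, w2=0.1):
    """Eq. (14): weighted geometric mean of CPE utilities times the BS utility."""
    prod = 1.0
    for u in u_cpe:
        prod *= u
    return prod ** (w1 / len(u_cpe)) * u_bs ** w2

print(round(rru_req(250e3)))   # a 250 kbps connection needs about 94 sub-carriers
```

Because (14) is a geometric mean, one badly served CPE (utility near 0) drags uglobal toward 0 even if all other CPEs are fully satisfied, which is exactly the fairness property intended by the product form of (13).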

[Fig. 2. Utility function (g1) employed in the WRAN BS CE testbed and alternative utility functions, plotted versus the normalized metric x/x0.]

[Fig. 3. Spectral efficiency function (f1) employed in the WRAN BS CE testbed and alternative utility functions, plotted versus the available RRU at a given WRAN channel.]

The individual CPE utility is defined as

  ucpe = f1(Pb^(−1), Pb0^(−1)) × f2(Rb, R0) × f3(Pt^(−1), Pt0^(−1)),   (15)

where Pb, Rb, and Pt are the measured or estimated BER, data rate, and transmit power (linear) of the CPE, respectively, and Pb0, R0, and Pt0 are the target BER, data rate, and transmit power of the CPE, respectively. The f-functions in (15) are modified hyperbolic tangent functions:

  fi(x, x0, ηi, σi) = (1/2) { tanh[ (log(x/x0) − ηi) / σi ] + 1 },  i = 1, 2, 3,   (16)

where x and x0 are the performance metric and its target value, respectively, and ηi and σi are the threshold and the spread parameter, respectively. The modified hyperbolic tangent function is a type of sigmoid function that can accommodate a large range of performance variations and captures the value of the service to the user quite naturally [74]. The threshold (η) and spread parameter (σ) are chosen such that fi(x, x0, ηi, σi) is 0.95 when the metric (x) achieves the target (x0) and is 0.05 when the metric is one decade away, as shown in Fig. 2 and Fig. 3. If a solution does not meet the target goal, the utility decreases sharply; and since solutions that result in excessively high QoS provide little value to the user, the increase of the utility beyond the target is marginal. As can be seen from Fig. 2, the employed utility function is monotonic and bounded by 0 and 1. In essence, the individual CPE utility represents the user's degree of satisfaction with the overall radio resource management.

The normalized BS spectral efficiency (uBS) is defined as

  uBS = (1/M) Σ_{i=1}^{M} uBS,i,   (17)

where M is the total number of TV channels supported by the BS and uBS,i is the spectral efficiency for the i-th WRAN channel, which also indicates the radio resource utilization of this channel:

  uBS,i = (1/2) { 1 + tanh[ (RRUavailable,i − RRUcapacity) / σRRU ] },   (18)

where RRUavailable,i is the number of available RRUs of the i-th WRAN channel at the BS, RRUcapacity is the maximal number of available sub-carriers of a WRAN channel, and σRRU is the spread parameter of the modified hyperbolic function. Like ucpe, uBS,i is monotonic and bounded by 0 and 1, as shown in Fig. 3. The rationale for adopting the modified hyperbolic tangent function in (18) is that it helps the CE to squeeze the spectrum used by the WRAN BS through the optimization process: a solution that assigns the CPEs to sub-carriers spread over two or more WRAN channels will produce a lower BS utility (uBS) than the more spectrally efficient solution in which the CPEs are assigned to sub-carriers within the same WRAN channel. The weights in (13) and (14) can be determined by the WRAN operator based on its priority and goal.

C. CE Performance Evaluation Methodology

As discussed, CE performance evaluation is very challenging for CR developers, equipment vendors, and regulators, because a CR operates very differently from traditional radios due to its flexibility, learning capabilities, and the demanding or unpredictable operating environments. There is a compelling need for new testing methodologies for evaluating a CE under various radio scenarios. We believe that the most accurate predictor of the future performance of a CR is to emulate it in a similar situation, not unlike the behavior-based interview. We propose REM-based scenario-driven testing (REM-SDT) as a generic approach to evaluating CE performance [83], as illustrated in Fig. 4. The CR under test is subjected to various realistic situations stored in the REM, which could be in the form of machine-readable XML files. The REM could also be used as a virtual "radio environment generator" together with other test equipment, such as arbitrary waveform generators. One way to generate sufficient testing scenarios is to exploit the REM and apply the Monte Carlo simulation method to produce a large number of testing scenarios.
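The design rule for (16), i.e., utility 0.95 at the target and 0.05 one decade away, pins down η and σ. A minimal sketch (the derived values below follow from that rule but are illustrative, not necessarily the testbed's exact settings):

```python
import math

ATANH_09 = math.atanh(0.9)        # tanh(1.472...) = 0.9
SIGMA = 1.0 / (2.0 * ATANH_09)    # spread parameter sigma
ETA = -0.5                        # threshold eta (in decades)

def f(x, x0, eta=ETA, sigma=SIGMA):
    """Modified hyperbolic tangent utility of Eq. (16): monotonic, bounded in (0, 1)."""
    return 0.5 * (math.tanh((math.log10(x / x0) - eta) / sigma) + 1.0)

print(round(f(1.0, 1.0), 2), round(f(0.1, 1.0), 2))   # 0.95 at target, 0.05 a decade below
```

The log scaling is what lets a single function shape cover metrics spanning several orders of magnitude, such as BER targets from 10^−2 to 10^−6.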

[Fig. 4. Illustration of REM-based scenario-driven CR testing (REM-SDT): a waveform generator and the REM produce a set of radio scenarios (including waveforms) for testing, while the tester engine and GUI monitor the adaptations and evaluate the performance of the CR (CE) under test.]

Note that not only the typical radio scenarios but also extreme scenarios (e.g., when the required RRU from the CPEs approaches or exceeds the normal capacity of the BS) should be considered when testing the WRAN CE. The CR tester could emulate various PU waveforms with certain usage patterns in certain frequency bands and then measure the performance of the CR under test through its RF emissions, which can indicate its cognition level and the effects of its adaptations. The CR tester can also benchmark the performance of the CE under test against that of some baseline CR systems or some performance bounds; in this sense, the CR tester can be viewed as a CR as well. REM-SDT turns out to be a cost-efficient testing approach, since possible problems can be identified before the CE is deployed in the real network.

D. CE Testbed Experiment Results

We employ REM-SDT to evaluate the performance of the WRAN CE through a series of test scenarios described in XML files [83]. The CE makes decisions on which TV channel(s) to use, the transmit power levels of the BS and the CPEs, which modulation and coding scheme is to be employed by the BS and the CPEs, and the sub-carrier(s) allocated to each connection. The CE testbed can run different learning algorithms, such as local search (LS), genetic algorithm (GA), and the REM-enabled case- and knowledge-based learning algorithm (REM-CKL); refer to [83] and [85] for more details about these algorithms. Note that both the GA-based CE and the CKL-based CE have been implemented in conjunction with a local search for fine-tuning the final solution.

Simulation parameters for the WRAN BS CE testbed are summarized in Table VI. In the simulations, a number of new connections are added to the WRAN BS CE testbed, and twenty-five runs are conducted for each scenario.

TABLE VI
SIMULATION PARAMETERS FOR THE WRAN BS CE TESTBED

  No. of BSs: 1
  Cell radius: 33 km
  No. of new connections to be set up: 1 ~ 256
  Distribution of CPEs: random, uniformly distributed or clustered
  Types of service requested from CPEs and QoS (target BER): voice: 10 kbps, target BER 10^−2; video: 100 kbps, target BER 10^−3; low data rate: 250 kbps, target BER 10^−6; high data rate: 750 kbps, target BER 10^−6
  Channel model: AWGN
  Multiplexing/duplexing: OFDMA/TDD (downlink-to-uplink ratio 3:1)
  FFT mode: 2,048 (2,048 sub-carriers per TV channel)
  (TV) channel BW: 6 MHz
  No. of total TV channels supported at the BS: 8
  Protocol overhead (α): 0.1
  RRUcapacity: 2048
  σRRU: 800
  Weight vector [ω1, ω2]: [0.9, 0.1]

As can be seen from Fig. 5, the WRAN BS CE adapts much faster when using the CKL algorithm than when using the GA. By leveraging the REM, the adaptation time of the WRAN CE can be greatly reduced, which is critical for time-sensitive WRAN applications such as evacuating a TV channel for PUs. However, the rules and experience employed in CKL may only be useful for closely matched situations with a similar utility function; the case library and knowledge base may need to be updated accordingly when the utility function of the CE changes.

[Fig. 5. Average adaptation time (in ms) of the REM-CKL, GA, and LS algorithms versus the number of new connections [83].]

Fig. 6 shows that the GA-based WRAN CE can produce a consistently better average utility than the LS-based or CKL-based CE, especially under complicated situations. This is because the GA is a generic search and optimization tool that is independent from, or insensitive to, the specific radio scenario and/or utility function in use. It also indicates that the LS and CKL may simply not be able to find a viable solution under some extreme radio scenarios (e.g., when the required RRU from the CPEs approaches the capacity limit of a WRAN BS).

Our experiments with the CE testbed also show that the selection of the global utility function (geometric mean vs. sample average) has a significant impact on the achieved performance of the CR network nodes: when the sample average is maximized, the geometric mean might be very low due to the large deviation of the individual utilities. For the CE testbed, the reason to choose the geometric mean for the global utility is to provide "fairness" in QoS satisfaction for every CPE regardless of its location in the WRAN service area, since the CPEs might be sparsely distributed in a very large area with a cell radius of up to 100 km. It also indicates that the impact of using different global utility functions would be more apparent for a CR network consisting of a small number of nodes.
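The Monte Carlo scenario generation described above can be sketched as follows; the XML element and attribute names are hypothetical (the testbed's actual scenario schema is not reproduced here), with the service types and ranges taken from Table VI.

```python
import random
import xml.etree.ElementTree as ET

random.seed(42)
SERVICES = [("voice", 10_000, "1e-2"), ("video", 100_000, "1e-3"),
            ("low_data", 250_000, "1e-6"), ("high_data", 750_000, "1e-6")]

def random_scenario(scenario_id):
    """Draw one random WRAN test scenario and return it as an XML element."""
    root = ET.Element("scenario", id=str(scenario_id))
    ET.SubElement(root, "cell_radius_km").text = "33"
    conns = ET.SubElement(root, "connections")
    for _ in range(random.randint(1, 256)):          # Table VI: 1-256 new connections
        name, rate, ber = random.choice(SERVICES)
        ET.SubElement(conns, "connection", service=name,
                      rate_bps=str(rate), target_ber=ber)
    return root

xml_text = ET.tostring(random_scenario(1), encoding="unicode")
```

A batch of such files can then be fed to the CE under test, with the per-scenario utilities logged for benchmarking against baseline algorithms.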

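The scenario-driven testing loop described above (the tester generates radio scenarios, the CE under test adapts, and the tester scores the adaptations) can be sketched as a minimal harness. Everything below, including the scenario fields, the stand-in CE, and the scoring rule, is a hypothetical illustration of the loop, not the actual REM-SDT tester:

```python
import random

def generate_scenarios(n, n_channels=8, seed=1):
    """Emulate n radio scenarios: which TV channels PUs occupy, plus offered load."""
    rng = random.Random(seed)
    return [{"pu_channels": {c for c in range(n_channels) if rng.random() < 0.4},
             "new_connections": rng.randint(1, 200)}
            for _ in range(n)]

def ce_under_test(scenario, n_channels=8):
    """Stand-in CE: pick the first TV channel free of primary users, if any."""
    free = [c for c in range(n_channels) if c not in scenario["pu_channels"]]
    return free[0] if free else None

def run_rem_sdt(ce, scenarios):
    """Tester loop: score each adaptation (zero utility on a PU collision)."""
    scores = []
    for s in scenarios:
        ch = ce(s)
        scores.append(0.0 if ch is None or ch in s["pu_channels"] else 1.0)
    return sum(scores) / len(scores)

avg_utility = run_rem_sdt(ce_under_test, generate_scenarios(100))
print(f"average utility over emulated scenarios: {avg_utility:.2f}")
```

A real tester would replace the scalar score with the utility functions under study and benchmark several CEs against a baseline over the same scenario set.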
CONCLUDING REMARKS

CR is an emerging research area that requires interdisciplinary collaboration and an overhaul of wireless systems design. While the "definition" of CR is still under debate, CR performance evaluation is an open question. To evaluate the performance of CR for various purposes, a broad spectrum of performance metrics at different levels could be employed, ranging from node-level to network-level and application-level. For CR, the cognition capability relies on dynamically choosing the proper performance metrics and updating utility functions for decision making and learning. In this paper, technology maturity of CR is addressed as a measure to compare/evaluate CR systems; besides technology readiness, business and policy/regulation readiness are also open factors for evaluating the overall maturity of CR.

We saw that a game model of a CR network enables us to identify network steady-states even when decision processes are not well-defined. We saw where varying the choice of observation processes, decision processes, available actions, goals, or operating context can lead to radically different network behavior. We also saw that, when utility functions are designed to have symmetrical relationships between the radios, network convergence and/or stability can be guaranteed for a broad range of conditions, though optimality is a harder and more subjective condition to satisfy.

This paper also illustrated potential difficulties in translating performance metrics into the utility functions which guide CR adaptations. The performance metrics and utility functions employed in the studied WRAN CE testbed were defined in a heuristic manner, and the REM-CKL closely approximates the performance of the more complex GA solution. Alternate metrics or refined utility functions that can improve the achievable performance of a WRAN merit further research. REM-based scenario-driven CR testing was proposed as a promising approach to CR emulation and benchmarking.

Most CR network simulations require involvement at both the PHY and higher layers. However, the commonly used wireless network simulation tools are mostly designed for traditional wireless network simulation and mainly focused on layer-2 and layer-3. It is critical to incorporate faithful PHY models into these tools (or to improve their existing PHY models) without greatly increasing the code complexity and simulation execution time. How to make tradeoffs between CR network simulation fidelity (for various layers), reliability, and complexity, as well as incorporating the dynamic environmental information, is a challenging issue.

Based on our CR research and development experience, we believe that formalizing CR performance metrics and benchmarking methods will greatly advance CR research and development. Other interesting research issues include (i) development of a generic CR network simulation and testing tool (perhaps based on the REM-SDT approach); (ii) standardization of the performance metrics and benchmarking methods for specific CR networks; and (iii) standardization of sets of REMs to facilitate comparison of CR algorithms and architectures among researchers. Perhaps the best way to evaluate/develop CR technology is to put it into practice gradually and improve designs based on real-world testing.

ACKNOWLEDGEMENTS

Y. Zhao's work had been supported in part by the Electronics and Telecommunications Research Institute (ETRI), Texas Instruments (TI), and Wireless@Virginia Tech Partners. S. Mao's research is supported in part by the National Science Foundation under grant IIP-0738088. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the author(s).

REFERENCES

[1] Agilent Technologies, "IPTV QoE: Understanding and interpreting MDI values," White Paper, 2006.
[2] E. Bangeman, "The white spaces coalition's plans for fast wireless broadband," Ars Technica, 2007.
[3] M. S. Bazaraa, J. J. Jarvis, and H. D. Sherali, Linear Programming and Network Flows, 3rd edition. New York, NY: John Wiley & Sons, 2005.
[4] M. S. Bazaraa, H. D. Sherali, and C. M. Shetty, Nonlinear Programming: Theory and Algorithms, 2nd edition. New York, NY: John Wiley & Sons, Inc., 1993.
[5] T. X. Brown and A. Sethi, "Potential cognitive radio denial-of-service vulnerabilities and countermeasures," in Proc. International Symposium on Advanced Radio Technologies, Boulder, CO, Feb. 2007.
[6] R. Chen and J.-M. Park, "Ensuring trustworthy spectrum sensing in cognitive radio networks," in Proc. First IEEE Workshop on Networking Technologies for Software Defined Radio Networks, Sept. 2006.
[7] T. Chiang and Y.-Q. Zhang, "A new rate control scheme using quadratic rate distortion model," IEEE Trans. Circuits and Systems for Video Technology, vol. 7, no. 1, pp. 246-250, Feb. 1997.
[8] C. Cicconetti, A. Erta, L. Lenzini, and E. Mingozzi, "Performance evaluation of the IEEE 802.16 MAC for QoS support," IEEE Trans. Mobile Computing, vol. 6, no. 1, pp. 26-38, Jan. 2007.
[9] C.-T. Chou, S. Shankar N, H. Kim, and K. G. Shin, "What and how much to gain by spectrum agility?" IEEE J. Selected Areas Commun., vol. 25, no. 3, pp. 576-588, Apr. 2007.
[10] Comments of the National Telecommunications and Information Administration on FCC ET Docket No. 03-108: "Facilitating opportunities for flexible, efficient, and reliable spectrum use employing cognitive radio technologies," 2005.
[11] J. Do, M. Rabinowitz, and P. Enge, "Multi-fault tolerant RAIM algorithm for hybrid GPS/TV positioning," in Proc. ION Institute of Navigation National Technical Meeting, San Diego, CA, Jan. 2007.
[12] A. Fattahi, F. Fu, M. van der Schaar, and F. Paganini, "Mechanism-based resource allocation for multimedia transmission over spectrum agile wireless networks," IEEE J. Selected Areas Commun., vol. 25, no. 3, pp. 601-612, Apr. 2007.

[Fig. 6. Average total utility [83]: average total utility versus the number of new connections (20 to 180) for the REM-CKL, GA, and LS algorithms.]
“Cognitive radio: Key performance indicators. M. 2007. July 2004. “Game theory in communications: Motivation.0. Mitola III and G.” IEEE J. Baltimore. “Understanding the issues in software defined cognitive radio. Aug. San Antonio.” in 2004 IEEE MTTS International Microwave Symposium Digest. MacKenzie. explanation. Aug.” IEEE J. Gilles. 2006. Nov. [63] SDR Forum. Doyle.0). Virginia Tech. “Objective methods for assessment of video quality: State of the art. May 2007.1/D01 (D2. 2003. 2002. [37] Y. Yang. IEEE DySPAN’07. H. Aug. [52] Y. IEEE WCNC’07.18.” in Cognitive Radio Technology. “Applying artificial intelligence to the development of a cognitive radio engine. [54] F. 2005. E. Reed. “The role of ontologies in cognitive radio. Prentice Hall. [21] S.00). “Draft revision of Recommendation ITU-R SM. 2005.6. 2003. [25] IEEE Standard P1900. Pinson. H. no. Reed. “Cognitive radio definitions. “Cognitive radio: brain-empowered wireless communications.” [33] S.69–77. Kompella. [17] S. Stuhlmuller. MD. [50] O. “A low complexity dynamic frequency selection algorithm for cognitive radio networks. Royal Institute of Technology. T. Cognitive Radio Technology. “Soft combination and detection for cooperative spectrum sensing in cognitive radio networks. T. vol. 2006. Lee. [19] A. Multiobjective Evolutionary Algorithms and Applications. MacKenzie and S.1599 – Determination of the geographical and frequency distribution of the spectrum utilization factor for frequency planning purposes. and B. [29] ITU-T Radio Communication Study Groups. Masud. Zhao. [14] A.739–742. “Online modeling of wireless channels with hidden Markov models and channel impulse responses for cognitive radios. “Optimization for cooperative sensing in cognitive radio networks. Khor. Nguyen. July 2003.” in Proc. S. IEEE Globecom’07.D. Gaeddert.500-11. MD. [59] J. pp. 2007. Multiobjective Optimisation and Control.” [32] ITU-R Recommendation SM. Rappaport. vol. SPIE VCIP’03. Ph. and J. 
“Analog-to-digital converters.821–825. Elsevier/Newnes. pp. 1997. and S.” Presentation at the BWRC Cognitive Radio Workshop.).C. [65] Shenick Network Systems. N. Liu. System Functionality.. Prentice Hall..4.862. 201–220. Neel.132–139./Sept. IEEE Milcom’06. IEEE Globecom’01. “Analysis and design of cognitive radio networks and distributed radio resource management. pp. Rieser.6. “Testing MPEG based IP video QoE/QoS. and H. Bostian. S. J. B. Liang. 2003. G. Springer-Verlag London Limited. “Non-intrusive cognitive radio networks based on smart antenna technology. Majumdar. IEEE Commun. Bae. Oct. “Dynamic spectrum access in the time domain: Modeling and exploiting white space. Virginia Tech. W. [18] A. Hong Kong.C. [38] G. D. Haykin.. Perich. pp. B.1012–1032. M. Wicker. pp.” Mobile and Portable Radio Research Group (MPRG) Technical Report. C. Research Studies Press LTD. [16] J. vol. Ramanathan. 2006. May 2007. A. Fort Worth. L. Apr.” IEEE USA Position.” SDR Forum Technical Conference.” 2001. F. IEEE DySPAN’05. Rondeau. Z. efficient. 2007.” IEEE Signal Processing. Fette (ed. 2007. 1999. and Spectrum Management. Multiple Objective Decision Making Methods and Applications: A State-of-Art Survey. [51] N. Nov. Rondeau and C.23. W.13–18. Fette (ed. Elsevier/Newnes. Tong. IEEE DySPAN’05. Reed. K. Mao. Nov.” IEEE Commun. Bostian. 2006. “XG dynamic spectrum access field test results. [36] B. 2007 by SDR Forum. H. 8. “Facilitating opportunities for flexible. Reed. M. B. [53] E. Software Defined Radio Forum.). “High-Speed ADCs critical to software-defined radio. McHenry. Ma and Y. 2007.” Cognitive Radio Technology. and A. Selected Areas Commun. K. Washington. UNDER REVIEW FOR PROCEEDINGS OF THE IEEE – SPECIAL ISSUE ON COGNITIVE RADIO [13] FCC Notice of Proposed Rulemaking.” Working Document SDRF-06-R-0009.2. Dublin.” in Proc. Draft Recommended Practice for the Analysis of In-Band and Adjacent Band Interference and Coexistence Between Radio Systems. Reed. 
“Working document towards a preliminary draft new reort: Software defined radio in land mobile services (Question 230-1/8). Rennes. J. [35] W. May 2007. [22] C. Soliman. “Collaborative spectrum sensing for opportunistic access in fading environments. Dissertation. and J.487–495. P. D.). Lugano. Software Radio–A Modern Approach to Radio Engineering. “On path selection and rate allocation for video in wireless mesh networks.4. “Cognitive radio: Making software radios more personal. no. E.. [56] R. J. Apr. Papantonopoulos. Baltimore. J. Neel and J.51–57.” in Proc. [49] J.” Ph. C.” International Telecommunications Union Document 8A/121-E.45. Automatic Evaluation of Real-Time Multimedia Quality: a Neural Network Approach. MD. J. Kim.INVITED PAPER. Jan.D. Fehske. 2005. [45] J.5. Liang. Approved on Nov. Krenik. 2006.” in Proc. H. [48] J. vol. 2004. pp. 2005. Selected Areas Commun. R. [24] IEEE. pp. Draft Standard Definitions and Concepts for Dynamic Spectrum Access-Terminology Relating to Emerging Wireless Networks. Fette (ed. to appear. 14 [40] A.6. pp. pp. [64] M. [57] T. Wyglinski. 03-108. [34] M.” in Proc. Q. Nov. 2005. “Synthetic Symmetry in Cognitive Radio Networks. S. MD.” Electronic Engineering Times. 2001. W.43.” in Cognitive Radio Technology. Sherali.. Mitola III. “Policy-based network management for neXt Generation spectrum access control. Maguire Jr. T. Sousa. no.” Ph. Ireland.3). 2007. Aug. “Cognitive techniques: Physical and link layers. Nov. Rondeau. J. Livsics.2 (D4.. 1996. [55] M.” ET Docket No. Sept. Wireless Communications Principles and Practice. Germany. Broadcasting.V-684-687. no.“Techno-economic of collaborative based secondary spectrum usage-E2 R research project outcomes overview. “Perceptual evaluation of speech quality (PESQ): An objective method for end-to-end speech quality assessment of narrow-band telephone networks and speech codecs. “Comparing subjective video quality testing methodologies. Ghasemi and E.). IEEE Globecom’07. Tan. H. no. 
Peh and Y. Artificial Intelligence and Intelligent Systems. L. and T.” Geneva. [46] J. 2002. Li. Elsevier/Newnes. 2003. W. Reed and C. Gallagher. K.” Network World. 2005. Kokar.T.318–327.D. and application to power control. IEEE DySPAN’05. Girod. and J. Baltimore. Morales. and B. and X. France. Ding. Stoppiana. Baina. [61] T. D.” Tutorial at IEEE DySPAN’05. “Analysis of packet loss for compressed video: does burst-length matter?” in Proc. Mar.64–65. Feb. [15] B. Huang. Baltimore.. [42] J. [67] K. June 2007. vol.” in Proc. P. Menon.E.” in Proc. [31] ITU-R Recommendation SM-1046-2. Girod.” IEEE/ACM Trans. “A New Approach to Signal Classification Using Spectral Correlation and Neural Networks. “Cognitive radio: An integrated agent architecture for software defined radio. Hou. Y. pp. N. M.573– 582. Nov. 2005.” IEEE Commun. and A. 2004. [68] K. Wolf. “Cognitive radio definition and nomenclature. Berlin. M. Networking. “Methodology for the subjective assessment of the quality of television pictures. “Cognitive radios for dynamic spectrum access. Neel. 2004.

Dr. no. He also received a Certificate of Appreciation from the IEEE Computer Society in 2007. Welch and J. Dr.” in Proc.3. Worcester Professor in the Bradley Department of Electrical and Computer Engineering. cognitive networks.” in Proc. both in Electrical Engineering.S. and J. 1995. Beijing from 1997 to 1998. Yates. Bae. Dr. Principles of Digital Communication and Coding. and J. Beijing. and multimedia communications. da Silva. degree in Electrical and Computer Engineering from Polytechnic University. Chong. H. J.D. Reed. Selected Areas Commun. Jeffrey H. and S. [85] Y. pp. Y.24. Zhao. VA.darpa. degrees from Tsinghua University. He received the M. vol. Um. J. J. Moscibroda. Currently. He was a Research Member in the IBM China Research Lab. China.3. CA. Mao’s research interests include cross-layer design and optimization of wireless networks. [online] Available: http://support. VTIP:07-060. Zhao. of IEEE CROWNCOM’07. 2007. Orlando. Reed. In 2005.” Ph. 2007. Gaeddert. Reed’s area of expertise is in software radios. J. Reed served as Director of the Mobile and Portable Radio Research Group (MPRG). Dublin. Zhao. “A Survey of dynamic spectrum access . and Y. “Network support – The radio environment map. He is the co-author of a textbook.13. Reed. S. [82] Y. [72] P. degree in System Engineering and the Ph. pp. Oct. [80] Y. His book. 2007 [86] Y. 2004).and knowledge-based learning algorithms for IEEE 802. and J. Tong. VA. Zhao. “Applying radio environment maps to cognitive WRAN systems. L. H. [78] Q. no.” Virginia Tech Cognitive Radio Work Group Wiki. and a research intern in Avaya Labs-Research. S. Xiao. Zhao.. I. and J. An Introduction to Ultra Wideband Communication Systems was published by Prentice Hall in April 2005 and is considered one of the most comprehensive books on Ultra WideBand (UWB) communications.). P. and E. “Performance evaluation of radio environment map enabled cognitive spectrum-sharing networks. NJ in the summer of 2001. respectively. 
IEEE MILCOM’07. . From June 2000 to June 2002. N. C.S. Brooklyn.22 WRAN cognitive engines.vtip. degree in Electrical Engineering from Virginia Tech in 2007. [81] Y. pp. 1979. wireless networks and communications signal processing. no. Morales. Gaeddert. “A proposed media delivery index (MDI). P. and M.” in Proc. 19962000. China. Mao. 15 Shiwen Mao (S’99–M’04) received the B. Reed (S’78–M’80–SM’98–F’05) is the Willis G. P. [79] Q.” IEEE Signal Processing. Apr. J. vol.R. game theory. Blacksburg. B.D.S. Clark. In 2007. smart antennas. of the First IEEE Workshop on Networking Technologies for Software Defined Radio Networks. Zhao. Reston. ”A generic cognitive engine design based on radio environment map (REM). Apr. [70] “Cognitive Radio Definition. Auburn. IEEE DySPAN’07. Orlando. May 2007. Networking. James O. Dublin. “KNOWS: Knognitive networking over white spaces.php?id=192679. php?id=cognitive radio:definition. no.” in Proc. Shroff. he is an Assistant Professor in the Department of Electrical and Computer Engineering. He currently serves as Director of the newly formed umbrella wireless organization Wireless@Virginia Tech. Morales. respectively.” in Proc.210–221. 2006.mil/sto/solicitations/ WNaN/. Zhao. of IEEE WCNC’07. H. Beijing. Aug. He was the representative of China Unicom at International Telecommunication Union (ITU) third generation (3G) mobile communications standardization meetings during 1998 and 2000. K. Santa Clara.” in Proc. FL. Raymond. 2007. 2007. H. in 1994 and 1996. 11. and M. Aug.79–89. Xiao. networking. A. in 2000 and 2004. K. Narlanka. Gaeddert.org/dokuwiki/doku. K. Chou. H. [74] M. H. Dr. Elsevier. 2007. Um.1341–1347. Neel (S’99–M’07) James Neel earned his PhD in Electrical Engineering from Virginia Tech in 2006. [online] Available: http://www.” Patent Application. Chen. degrees in Electrical Engineering from Tsinghua University.25. AL. Vienna. Youping Zhao (S’04-M’07) received his Ph. A.111–115. New York: McGraw-Hill. Bae. FL. Bae. 
Dr.signal processing. Apr. Virginia Tech.” IEEE J. [84] Y. [71] A. “Enabling cognitive radios through radio environment maps. TCP/IP Essentials: A Lab-Based Approach (Cambridge University Press. K. Reed. Abraham Prize in the Field of Communications Systems. Wang. Zhao’s current research interests are in the general areas of signal processing and cognitive radio technologies. B. Auburn University. L. Ireland. 2003.cos. Holmdel. and a Wireless System Engineer at China Unicom.S. T.” IETF RFC 4445. VA from December 2003 to April 2006. Apr. R.org/availableTech/technology.INVITED PAPER. “A framework for uplink power control in cellular radio systems. J. and K. Bahl.” IEEE/ACM Trans. M. [75] R.7. 2007. IEEE CROWNCOM’07. S. [77] Q. James founded Cognitive Radio Technologies with Jeff Reed to advance and commercialize low complexity networked cognitive radio. During 2004 he also received an award from the SDR Forum for his pioneering 2001 publication that provides a mathematical foundation to cognitive radio based on game theory. Bae. 2001-2003. K.589–600. vol. Fette (ed.” in Proc. Mar. vol. Zhou.mprg. 2007. Virginia Tech. Omura. “Decentralized cognitive MAC for opportunistic spectrum access in ad hoc networks: A POMDP framework. “Optimization of detection time for channel efficiency in cognitive radio systems. Beijing. IEEE DySPAN’07. and K. “Overhead analysis for REM-enabled CR networks. Apr. D. Chandra. Reed. Liu. His latest book. Wu. Dr. Swami. K. Mao received the 2004 IEEE Communications Society Leonard G.com/cgi-bin/getRec?id=20070226a1. Selected Areas Commun. respectively. pp. K. NY. L. and J.S. Dissertation. “Low-complexity approaches to spectrum opportunity tracking. K. Midkiff. Sept. [76] Y. and regulatory policy.. UNDER REVIEW FOR PROCEEDINGS OF THE IEEE – SPECIAL ISSUE ON COGNITIVE RADIO [69] US DARPA WNaN Program: http://www. 2007. “Development of Radio Environment Map enabled case. 
He is on the Editorial Board of the Hindawi Advances in Multimedia Journal and the Wiley International Journal of Communication Systems. Sept. He received his B. Viterbi and J. Ireland. Ferrell. Wang. the theory and implementation of software radio. J. [83] Y. http://fedbizopps. Dr. FL. P. Orlando. J. Le. Zhao. B. Krishnamachari. [73] J. Reed became Fellow to the IEEE for contributions to software radio and communications signal processing and for leadership in engineering education. He is currently a Principal Research Engineer at Shared Spectrum Company. He has been a Research Scientist in the Bradley Department of Electrical and Computer Engineering. Reed. and processor design. Sadler.” in Cognitive Radio Technology. 2006. Hong Kong.2. L. China in 1994 and 1997. 2006. Zhao and B.D.” IEEE J. Software Radio: A Modern Approach to Radio Design was published by Prentice Hall in May 2002. He was a Senior System Engineer at LinkAir Communications. Morales. Zhao. His research interests include the analysis and design of networked cognitive radio algorithms. Yuan. B. L. pp. “A Utility-Based PowerControl Scheme in Wireless Cellular Systems. Reed received the College of Engineering Award for Excellence in Research in 2001.