
IAB Online Ad Measurement Study

December 2001

The Interactive Advertising Bureau (IAB), the Media Rating Council (MRC) and the Advertising Research
Foundation (ARF), along with PricewaterhouseCoopers, are pleased to present the Online Advertising
Measurement Study. This report aggregates data and information compiled by PricewaterhouseCoopers from 11
participating companies identified and selected by the IAB.
The 11 participating companies include a cross-section of the key industry players -- portals, destination sites, ad
networks and third-party ad servers -- which combined represent nearly two-thirds of total industry revenues.
3rd Party Ad Networks / Ad Servers
Avenue A
DoubleClick
Engage

Portals

America Online
Lycos
MSN
Yahoo!

Destination Sites

CNET
Forbes.com
New York Times Digital
Walt Disney Internet Group

PricewaterhouseCoopers would like to thank all of the participants and their companies for their contribution of
valuable time spent in the development of the IAB Online Ad Measurement Guidelines. We appreciate the
support that the participants provided us to complete this study.
The overall purpose of this project is to determine the general source of measurement and reporting differences
between the aforementioned industry players. The project findings in this report include a summary of those
discrepancies found and contributing factors, in addition to recommendations designed to address the
discrepancies in measurement and reporting of online advertising data.
This study was conducted independently by PricewaterhouseCoopers on behalf of the IAB. The report
aggregates the findings generated from over 60 interviews with management responsible for key aspects of the
online advertising process, in addition to validating leading online advertising metrics against definitions via
scripted testing. This consulting report does not represent an audit as described by the AICPA.
The IAB/MRC/ARF have reviewed the results and proposed recommendations and subsequently worked with the 11
participants and other key constituents (e.g., the AAAA) to develop a set of Industry Measurement Guidelines. The
Guidelines represent a consensus of acceptable practices and will be distributed to the industry at large on
January 15, 2002 for adoption on a voluntary basis.
2001 PricewaterhouseCoopers. All rights reserved.

Table of Contents

Background and Scope ........................... 4
  About this Report ............................ 5
Detailed Findings
  Metrics & Definitions ........................ 6
  Processes & Controls ......................... 33
Recommendations
  Metrics & Definitions ........................ 44
  Processes & Controls ......................... 47
Appendix ....................................... 49


Background and Scope


The primary objectives of the IAB Online Ad Measurement Study were to:

review the current measurement criteria and practices used by a representative group of sell-side
companies for online advertising and audience measurement reporting

document and report the comparability of existing metrics used by the industry

propose a common set of industry definitions and guidelines for data analysis and reporting

The scope of work consisted of:

coordinating and working with 11 participating companies identified and selected by the IAB
three 3rd party ad servers / ad networks
four destination sites
four portal sites

scheduling and conducting on-site interviews with appropriate business unit management to
understand and document each participant's online advertising measurement and reporting system:
what types of audience and advertising data are measured
how the data is measured and how it is reported
performing scripted testing to assess whether each participating company's collection and reporting
systems track audience and advertising metrics in accordance with the company's definitions
determining discrepancies between definitions, editing procedures and reporting, and follow-up on
testing issues

Our work focused on specific metrics and ad formats:

a comprehensive list of audience measurement and ad delivery metrics was documented

the scope of this study focused on the five metrics identified on page 7 of this report

five metrics were tested, with an emphasis on ad impressions and clicks



Background and Scope -- About this Report


A fundamental premise of this report is that in order to achieve reliable, accurate and
comparable ad campaign measurement reporting, there must exist a set of standardized
metric definitions that are applied to a well-controlled process.

[Diagram: Standard Metric Definitions + Well-Controlled Process = Reliable Ad Campaign Measurement Reporting]

Refer to pages 33-41 for additional information on the concept of how a well-controlled
process manages the risks inherent in high-volume, low-dollar transactions -- hallmarks of
online advertising management systems.
This study focused on specific audience measurement and advertising metrics. There are,
however, many other metrics and related research aspects that were not addressed, but can
be identified through the work of various industry organizations (e.g., MRC Minimum
Standards for Media Rating Research).


Detailed Findings: Metrics, Definitions, Audits


Online Ad Measurement Study


Five Primary Metrics Measure and Track Audience and Ad Transactions


Top Metrics

Based on responses from participants, the top five metrics used consistently for ad delivery reporting
and audience measurement are:
Ad Impressions
Clicks
Unique Visitors
Total Visits
Page Impressions

Although other metrics exist, they are not uniformly defined or utilized. For example:
time spent on page
number of completed user registrations
conversions

[Chart: Leading Ad and Audience Measurement Metrics -- number of participants reporting each metric: Ad Impressions 11, Clicks 11, Unique Visitors 10, Total Visits 10, Page Impressions 8]


Ad Impressions are the Dominant Currency Metric for Ad Revenues


Currency Metrics
While all five of the key metrics are tracked and reported, Ad Impressions are the dominant currency metric
(the metric upon which revenue-generating contracts are based).

Clicks as a Currency Metric
Only participants using a Cost-per-Action pricing model base contract revenues on click results. Three of
eleven participants use Cost per Action.

Page Impressions: Secondary Currency Metric
In addition to ad impressions, one participant uses page impressions to track transactions where advertisers
pay for embedded content or sponsor a page.

Other Currency Metrics
Other, less common currency metrics include email ad metrics (e.g., subscribers, messages delivered,
messages opened) and ad metrics based on user actions (e.g., conversions and referrals).

[Chart: Currency Metrics -- number of participants basing contract revenues on each metric: Ad Impressions 11, Clicks 3, Page Impressions 1, Unique Visitors 0, Total Visits 0]

Ad Impressions



Ad Impression Measurement: Server Initiated and Client Initiated


The measurement of an ad impression transaction requested by a server or a client (browser) is tied to the process
used to request the ad. Listed below are definitions of the two ad measurement processes that result from server-initiated
or client-initiated ad impression requests.
Server Initiated Measurement
This process occurs when a web server, prior to serving a web page to a user agent request (browser, robot,
other), builds the web page with links to an ad resource (image/asset server, internal ad server, 3rd party ad server),
and records an ad impression transaction. The ad impression transaction is recorded (via logs on the web server,
or logs/real-time on an internal ad server) prior to serving the requested web page to the user agent.
Client Initiated Measurement
Occurs when the measurement of an ad impression is the result of a direct connection between the user agent
(browser, robot, other), and the ad server. This process can take two forms:
1. Impressions served via advanced html tags (IFRAME/Javascript/ILAYER tags). In this case the ad server
(typically) records an impression transaction and responds to the user agent with the contents of the selected
creative, which may include html that refers to image/assets, or html that refers to another ad resource (3rd
party server).
2. In some cases the ad impression transaction is recorded via an independent request (via an HTTP 302
redirect) to a special ad transaction logging server. This independent request may utilize web beacon
technology and is initiated by the user agent at the same time it requests the image/rich media from the
image/asset server.
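To make the second form concrete, the following is a hedged sketch (not drawn from any participant's system) of an ad transaction logging server: the user agent makes an independent beacon request, the server records one ad impression, and it answers with a 1x1 transparent image. The endpoint path and query parameter names are invented for illustration.

```python
# Hedged sketch (not from the study): a minimal ad transaction logging server
# for the second client-initiated form described above. One impression is
# recorded per beacon request before the response is sent.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# A commonly used 1x1 transparent GIF payload; any valid tiny image would do.
TRANSPARENT_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
    b"\x21\xf9\x04\x01\x00\x00\x00\x00"
    b"\x2c\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02\x44\x01\x00\x3b"
)

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/beacon":                 # only beacon calls count impressions
            self.send_error(404)
            return
        params = parse_qs(url.query)
        # Record the ad impression transaction (here: append to a flat log file).
        with open("impressions.log", "a") as log:
            log.write("%d\t%s\t%s\n" % (
                int(time.time()),
                params.get("placement", ["unknown"])[0],   # hypothetical parameter
                self.headers.get("User-Agent", ""),
            ))
        # Respond with the transparent image so the hosting page renders normally.
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Cache-Control", "no-cache, no-store")  # discourage caching
        self.end_headers()
        self.wfile.write(TRANSPARENT_GIF)

if __name__ == "__main__":
    HTTPServer(("", 8080), BeaconHandler).serve_forever()
```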



Ad Impressions: Server Initiated vs Client Initiated Ad Requests


The delivery processes listed below are for requesting ad impressions only.

Server Initiated Ad Request
Five of eleven participants use a server initiated process to request the majority of ads.

Client Initiated Ad Request
Seven of eleven participants use a client initiated process to request the majority of ads.

One participant uses both client and server initiated approaches, depending on where the ad is served in the ad
management system.

[Chart: Process for Tracking Impressions -- Server Initiated 5 participants, Client Initiated 7 participants]


Server Initiated Ad Request and Counting Process

1. Browser calls the Publisher Web Server (content request).
2. The Web Server calls the Publisher Ad Server (ad request).
2a. Type 1: the impression is counted at the Publisher Ad Server when the ad request occurs.
3. The Publisher Ad Server responds with a reference to an ad asset (image, video, audio, etc.) on an Asset or
Image Server. The reference may also be a link to a Third Party Ad Server.
4. The Publisher Web Server responds with HTML content, including embedded ad content from the Publisher Ad
Server and any ad assets (image, audio, video) stored on the Publisher Web Server.
4a. Type 2: the impression is counted at the Publisher Web Server when the HTML content is prepared, but after
the Publisher Ad Server response.
5. If the ad involves a remote asset (i.e., image, audio, video not located on the web server), the browser requests
the asset from the Asset Server.
5a. If the Publisher Web Server response includes a link to a Third Party Ad Server, the Browser will request the ad
from the Third Party Ad Server.
5b. The Third Party Ad Server will record the ad request.
5c. The Third Party Ad Server will respond with a 302 redirect or HTML pointing to an Asset Server.
6. The Asset Server responds to the asset request by sending the asset back to the browser.

[Diagram: Browser, Publisher Web Server, Publisher Ad Server, Third Party Ad Server, and Asset or Image Server.
Type 1 = Publisher logging on the Ad Server; Type 2 = Publisher logging on the Web Server after retrieving the ad;
Third Party logging occurs at the Third Party Ad Server.]



Server Initiated Process: Majority of Participants Count at the Ad Server


Five of eleven participants use a server initiated ad request process. The ad request process that a participant
uses determines in part where the participant counts ad impressions. The statistics listed below are for counting
ad impressions only.

Ad Server Logging
Four of the five participants using a server initiated ad request process count ad impressions at the ad server,
after receiving a request from the web server but prior to rendering the content.

Web Server Logging
One of the five participants counts ad impressions at the web server prior to rendering the content, after the ad
server responds to the web server request (illustrated as Type 2 on the Server Initiated Ad Request diagram on
page 12).

[Chart: Server Initiated Transaction Logging Point -- Ad Server 4 participants, Web Server 1 participant]



Client Initiated Ad Request and Counting Process


1. Browser (user agent) calls the Publisher Web Server (content request).
2. The Publisher Web Server responds with HTML content, including a reference instructing the Browser to make a
request to the Publisher Ad Server.
3. The Browser parses the HTML from the Publisher Web Server and makes secondary calls to the Publisher Ad
Server (usually via IMG SRC, IFRAME SRC, ILAYER or SCRIPT SRC tags).
3a. Type 1: the Publisher Ad Server records the ad impression prior to the Browser requesting the ad asset.
4. The Publisher Ad Server responds to the Browser with a 302 redirect (if an IMG SRC tag) or HTML.
5. The Browser requests the asset from the Asset Server.
5a. If the Publisher Ad Server responds with a link to a Third Party Ad Server, the Browser will request the ad from
the Third Party Ad Server.
5b. The Third Party Ad Server records the ad request.
5c. The Third Party Ad Server responds to the Browser with a 302 redirect (if an IMG SRC tag) or HTML pointing to
the Asset Server.
5d. A Publisher may record an ad impression at the same time the image is rendered by the Browser by issuing a
request to a Publisher Ad Counting Server, using either a web beacon (for rich media ads) or an image call to a
portion of an ad.
5e. Type 2: the Publisher Ad Counting Server records the ad impression simultaneously with the Browser
requesting the ad asset.
5f. The Publisher Ad Counting Server responds to the Browser with a 1x1 transparent image (web beacon) or an
image call to a portion of the ad.
6. The Asset Server responds to the asset request from the Browser with the image or rich media content.

[Diagram: Browser, Publisher Web Server, Publisher Ad Server, Third Party Ad Server, Asset or Image Server, and
Publisher Ad Counting Server. Type 1 = Publisher logging prior to the Browser requesting the ad asset; Type 2 =
Publisher logging simultaneous with the Browser requesting the ad asset; Third Party logging occurs at the Third
Party Ad Server.]



Client Initiated Process: Majority of Participants Count at the Ad Server


The ad request process that a participant uses determines in part where the participant counts ad impressions.
The statistics listed below are for counting ad impressions only.

Counting in a Client Initiated Ad Request Process
Six of the seven participants using a client initiated ad request process count ad impressions at the ad server,
after receiving a request for an ad from the client.

One participant counts ad impressions at a separate ad logging server, after the ad server responds to the client
and when the client initiates a separate redirect call to the ad logging server.

[Chart: Client Initiated Transaction Logging Point -- Ad Server 6 participants, Separate Ad Logging Server 1 participant]



All Participants Support the Use of Cache Busting

All eleven participants support the use of cache busting technology. Cache busting mechanisms are employed to
reduce the potential for an ad request to be cached in either a web browser or a proxy server. Cached ads result in
undercounting impressions, because the impression is served from a proxy or browser cache rather than an ad
server.

Types of cache busting technology utilized include:
Appending a random number to the end of the ad request.
Appending a time/date stamp to the end of the ad request.

Third party ad serving firms provide cache-busting guidelines to websites that do not have cache busting
capabilities.

[Chart: Cache Busting Support -- 11 participants support cache busting, 0 do not]
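Both techniques reduce to appending a varying token to the ad request URL. A hedged sketch follows (not from the study); the query parameter name "cb" is an arbitrary choice.

```python
# Hedged sketch (not from the study): appending a cache-busting value to an ad
# request URL, using either a random number or a time/date stamp as described
# above.
import random
import time
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_cache_buster(ad_url: str, use_timestamp: bool = False) -> str:
    """Return ad_url with a cache-busting query parameter appended."""
    buster = str(int(time.time())) if use_timestamp else str(random.randint(0, 2**31 - 1))
    parts = urlparse(ad_url)
    query = parse_qsl(parts.query)
    query.append(("cb", buster))                     # hypothetical parameter name
    return urlunparse(parts._replace(query=urlencode(query)))

# Each request yields a distinct URL, so a browser or proxy cache cannot
# satisfy it from a previously stored copy.
print(add_cache_buster("http://ads.example.com/ad?placement=homepage_banner"))
```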


Clicks



Clicks: Uniform Definition and Use of 302 Redirects


All eleven participants track clicks and have a consistent approach to the definition.

Uniform Definition
The definition of the click metric is the most consistently accepted and applied of the five key metrics. A click is a
user-initiated action of clicking on an ad element, causing a redirect to another web location. A click does not
include information on whether or not the user completed the redirect transaction.

Use of 302 Redirects
All participants base click metrics on redirects (or transfers) successfully processed by the ad server.

[Chart: Participants Using 302 Redirects for Tracking Clicks -- 11 track clicks via 302 redirects, 0 do not]



Click Request and Counting Process


1. The user clicks on an ad, which causes the Browser to request a target site from the Publisher Ad/Click
Transaction Server. The target site URL is typically included in the request.
1a. The Ad/Click Transaction Server records the click.
2. The Ad/Click Transaction Server responds to the Browser with a redirect (HTTP 302) to the Target Site location.
3. The Browser follows the redirect to the Target Site.
3a. In the case of a third party-served ad, the target site location is actually that of the Third Party Ad Server.
3b. The Third Party Ad/Click Server records the click request.
3c. The Third Party Ad/Click Server responds with a redirect to the Target Site.
4. The Target Site server responds to the Browser.

[Diagram: Browser, Publisher Ad/Click Transaction Server (Publisher click logging), Third Party Ad/Click Server
(Third Party click logging), and Target Site.]
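A hedged sketch of the publisher-side click transaction described above (not any participant's implementation): the click is recorded first, then the server answers with an HTTP 302 redirect to the target site; whether the user completes the redirect is not part of the click metric. Query parameter names are invented, and a real system would validate the target URL before redirecting.

```python
# Hedged sketch (not from the study): a click transaction endpoint that logs
# the click and issues the 302 redirect described in the narrative above.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class ClickHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        target = params.get("target", [None])[0]      # destination URL carried in the request
        campaign = params.get("campaign", ["unknown"])[0]
        if target is None:
            self.send_error(400, "missing target URL")
            return
        # 1a. Record the click transaction.
        with open("clicks.log", "a") as log:
            log.write("%d\t%s\t%s\n" % (int(time.time()), campaign, target))
        # 2. Respond with a 302 redirect to the target site.
        self.send_response(302)
        self.send_header("Location", target)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8081), ClickHandler).serve_forever()
```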


Unique Visitors



Unique Visitors: Cookie and Registration Based Methods Utilized


Ten of eleven participants track unique visitors. There are two primary methods used to track each unique visitor:

Cookie Based Definition
Eight of the ten participants tracking unique visitors use cookies, and two of the eight also use IP Address in
addition to cookies. Participants typically distinguish between recurring cookies (repeat visitors) and new cookies
(new visitors or repeat visitors that delete cookies).

Registration Based Definition
Two of the ten participants tracking unique visitors use registered users or user login counts.

[Chart: Methods for Tracking Unique Visitors -- Cookie-Based 8, Registrations 2, Not Tracking Unique Visitors 1]



Cookie Based Unique Visitors: Process for Determining New Visitors


Among the eight participants tracking visitors using cookies, different techniques are used to determine whether a
new cookie should be considered a new visitor.

Count All New Cookies
Four of the eight participants count all new cookies as new visitors. Two of the eight also use IP Address in
addition to cookies for additional user validation.

Exclude All New Cookies
One participant does not count any new cookies as new visitors. In this case, a unique cookie must visit the site at
least twice to be considered a new visitor.

Exclude Some New Cookies Based on Historical Data
Three of the eight participants attempt to estimate the number of repeat visitors with new cookies using known
user data. For example, a portion of new cookies is excluded from the unique visitor count based on an estimate of
the new cookies that represent repeat visitors that do not accept cookies or have deleted cookies from previous
visits.

[Chart: Process for Determining New Visitors -- Count All New Cookies 4, Exclude All New Cookies 1, Exclude Some New Cookies Based on Historical Data 3]
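As an illustration only (a hedged sketch, not any participant's method), the following applies the "count all new cookies as new visitors" rule to a simplified activity log; field names are invented.

```python
# Hedged sketch (not from the study): cookie-based unique visitor counting
# under the "count all new cookies" rule.
from dataclasses import dataclass

@dataclass
class Hit:
    cookie_id: str
    new_cookie: bool   # True if the cookie was issued on this request

def unique_visitors(hits: list[Hit]) -> dict[str, int]:
    """Count distinct cookies, split into new and recurring visitors."""
    seen: set[str] = set()
    new_cookies: set[str] = set()
    for hit in hits:
        seen.add(hit.cookie_id)
        if hit.new_cookie:
            new_cookies.add(hit.cookie_id)
    return {
        "unique_visitors": len(seen),              # every new cookie counts as a visitor
        "new_cookie_visitors": len(new_cookies),
        "recurring_visitors": len(seen - new_cookies),
    }

hits = [Hit("c1", True), Hit("c1", False), Hit("c2", False), Hit("c3", True)]
print(unique_visitors(hits))   # {'unique_visitors': 3, 'new_cookie_visitors': 2, 'recurring_visitors': 1}
```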


Total Visits



Calculating Total Visits: Based on Actual, Sample or Statistical Analysis


Ten of the eleven participants calculate total visits. The specific business rules used to define a visit (or session)
vary among most of the participants. For example, some participants use a time-based attribute that terminates a
visit after 30 minutes of inactivity. In addition to time-based rules, there are three methods used to calculate total
visits.

Actual
Six of ten participants use all of the user activity data to calculate total visits.

Sampling
Three of ten participants use a sample (several days during the period) of user activity to estimate total visits.
Some participants rely on outsourced service providers for this measurement.

Statistical Analysis
One participant performs statistical analysis to estimate total visits.

[Chart: Use of Actual, Sample or Statistical Analysis to Determine Total Visits -- Actual Data 6, Sampling Days or Periods 3, Statistical Analysis 1]
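A hedged sketch of the time-based rule (not any participant's implementation): visits are counted from actual activity data, with a visit ending after 30 minutes of inactivity. The threshold and record format are illustrative.

```python
# Hedged sketch (not from the study): counting total visits with a 30-minute
# inactivity rule from (cookie_id, timestamp) activity records.
from collections import defaultdict

SESSION_TIMEOUT = 30 * 60  # seconds of inactivity that terminate a visit

def total_visits(hits: list[tuple[str, int]], timeout: int = SESSION_TIMEOUT) -> int:
    """Return the number of visits across all visitors."""
    by_visitor: dict[str, list[int]] = defaultdict(list)
    for cookie_id, ts in hits:
        by_visitor[cookie_id].append(ts)

    visits = 0
    for timestamps in by_visitor.values():
        timestamps.sort()
        visits += 1                                  # the first hit opens a visit
        for prev, cur in zip(timestamps, timestamps[1:]):
            if cur - prev > timeout:
                visits += 1                          # a long gap starts a new visit
    return visits

# Visitor "a" has a 40-minute gap (2 visits); visitor "b" stays active (1 visit).
hits = [("a", 0), ("a", 600), ("a", 600 + 40 * 60), ("b", 100), ("b", 1500)]
print(total_visits(hits))  # 3
```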


Page Impressions



Page Impression Tracking


The page impression metric is used by eight of the eleven participants; the 3rd party ad server participants do not
track this metric. Publisher participants may use this metric if content-based advertising is embedded within a
page or a page is sponsored by an advertiser.

Page impressions are tracked using two different methods:

Logging at the Web Server
Six of the eight participants use standard web server logs for page impressions. Page impression transactions are
usually only counted if accompanied by successful HTTP status codes and are filtered from robotic activity.

Logging at a Separate Tracking Server
Two of the eight participants use web beacon technology to track page impressions. These participants utilize
either 3rd party web metric tracking firms or internal tracking servers to record the request.

[Chart: Page Impression Logging Points -- Web Server 6, Separate Tracking Server 2]
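As a hedged illustration of the web-server-logging method (not any participant's implementation), the following counts page impressions from simplified access-log records, keeping only successful HTML responses; robotic filtering, covered later in this report, is omitted here.

```python
# Hedged sketch (not from the study): counting page impressions from web
# server log records, keeping only successful HTML responses.
from dataclasses import dataclass

@dataclass
class LogRecord:
    path: str
    status: int
    content_type: str

def count_page_impressions(records: list[LogRecord]) -> int:
    """Count log entries that qualify as page impressions."""
    count = 0
    for rec in records:
        if 200 <= rec.status < 300 and rec.content_type.startswith("text/html"):
            count += 1            # assets, redirects and error responses are excluded
    return count

records = [
    LogRecord("/index.html", 200, "text/html"),
    LogRecord("/logo.gif", 200, "image/gif"),      # asset request, not a page
    LogRecord("/missing", 404, "text/html"),       # error code excluded
]
print(count_page_impressions(records))  # 1
```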



Page Impression Delivery and Measurement Process


(1) The user browser calls the web server using a URL entered by the user (the request may be initiated by the
user or via an automated page refresh). No participants measure page impressions at this point.
(2) The web server responds to the user browser by checking for an existing cookie and creating one if one does
not exist. The web server then renders the content, including web beacons if beacons are used for external
tracking. Six of eight publisher participants measure page impressions at this point.
(3) Where web beacons are used for tracking purposes, the user's browser calls the ad server to request an
invisible tracking image (i.e., one clear pixel) and passes the user's cookie along with the beacon request. No
participants track page impressions at this point.
(4) Where web beacons are used for tracking purposes, the ad server responds to the tracking request by
rendering and logging the tracking image. Two of eight publisher participants measure page impressions at this
point. Note: 3rd party servers may track page impressions at this point as part of specific Cost per Action
contracts.

[Diagram: User Browser, Web Server (with log data), and Separate Tracking Server (with log data).]


Filtering

[Diagram: Log data is filtered for Robots/Spider Activity and Internal IP Addresses to produce Filtered Log Data.]



All Participants Perform Some Level of Robotic Activity Filtering


Filtering for robotic activity falls into three categories:

Basic
All eleven participants perform some basic robotic activity filtering. Basic filtering techniques include:
Use of a robots.txt file to prevent well-behaved robots from scanning the ad server.
Exclusion of transactions with User Agent Strings that are either empty and/or contain the word "bot".

List of Known Robots
Eight of eleven participants also exclude transactions from lists of known robots. The lists are typically based on
User Agent String and/or IP Address and may be maintained by a 3rd Party Auditor. The number of identified
robots each participant lists varies from approximately ten to over seven hundred.

Behavioral Filtering
Two of eleven participants also conduct advanced behavioral filtering. These participants define business rules,
such as 50 clicks by a single cookie during a daily period, to identify robotic behavior.

[Chart: Robotic Filtering Approaches -- Basic Filtering 11, List Filtering 8, Behavioral Filtering 2]
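A hedged sketch combining the three filtering layers (not any participant's implementation); the robot list, thresholds and field names are illustrative.

```python
# Hedged sketch (not from the study): basic user-agent checks, a list of known
# robots, and a behavioral rule (e.g., more than 50 hits from a single cookie
# in a day) applied to simplified log records.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Hit:
    cookie_id: str
    user_agent: str

KNOWN_ROBOTS = {"ExampleCrawler/1.0"}     # stand-in for a maintained robot list
CLICKS_PER_COOKIE_PER_DAY = 50            # behavioral business rule

def filter_robots(daily_hits: list[Hit]) -> list[Hit]:
    """Return the hits that survive basic, list-based and behavioral filtering."""
    hits_by_cookie = Counter(h.cookie_id for h in daily_hits)
    kept = []
    for h in daily_hits:
        ua = h.user_agent.strip()
        if not ua or "bot" in ua.lower():                   # basic filtering
            continue
        if ua in KNOWN_ROBOTS:                              # list of known robots
            continue
        if hits_by_cookie[h.cookie_id] > CLICKS_PER_COOKIE_PER_DAY:
            continue                                        # behavioral filtering
        kept.append(h)
    return kept
```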



Filtering of Internal IP Addresses


Four of eleven participants filter transactions from within their company by removing all activity originating from IP
Address ranges on their company network. The reasons for filtering this activity include:

- Eliminating any activity generated by internal monitoring tools. These tools are similar to robots and are often
used to verify that a server is working properly.
- The demographic associated with a user within the company does not represent the primary demographic of the
website.
- Removing all activity generated by a company user conducting testing within the live environment to ensure that
a creative is being served properly.

[Chart: Internal IP Address Filtering -- Removes Internal IP Addresses 4, Includes Internal IP Addresses 7]
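As a hedged illustration (not any participant's implementation), internal traffic can be excluded by testing each source address against the company's own network ranges; the ranges below are placeholders.

```python
# Hedged sketch (not from the study): excluding activity that originates from
# the company's own network using the standard-library ipaddress module.
import ipaddress

INTERNAL_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.0.0/16"),
    ipaddress.ip_network("203.0.113.0/24"),   # hypothetical corporate range
]

def is_internal(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in INTERNAL_NETWORKS)

def drop_internal(records: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Keep only (source_ip, request) records originating outside the company network."""
    return [(ip, req) for ip, req in records if not is_internal(ip)]

log = [("10.1.2.3", "/ad?x=1"), ("198.51.100.7", "/ad?x=2")]
print(drop_internal(log))   # [('198.51.100.7', '/ad?x=2')]
```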


Independent Verification Audits



Independent Verification
Nine of the eleven participants use 3rd Party firms to conduct independent audits over the ad delivery processes or
individual (or campaign) transactions. Two of the eleven participants do not employ independent audits.

Process Audits
Five of the eleven participants employ outside auditors to conduct process audits over the entire transactional
process used to generate online ad metrics. These audits are conducted periodically (e.g., every 6 months) and
result in an audit opinion (performed under AICPA standards) over the effectiveness of the controls and processes
in place.

Activity Audits
Seven of the eleven participants employ outside firms to re-count their transactional data and provide verified
metric reports to advertisers. These activity reports are typically produced monthly.

Both Process and Activity Audits
Three of the nine participants conduct both process and activity audits.

No Audits
Two of the eleven participants do not employ independent verification.

[Chart: Types of Audits Conducted by Participants -- Process Audit 5, Activity Audit 7, No Audit Performed 2]


Detailed Findings: Processes & Controls


Online Ad Measurement Study


The Online Advertising Process


Good Controls Result in Reliable Reporting
The diagram on the following slide illustrates the basic process involved in selling, delivering and reporting online
advertising.
Slide 36 describes how a well-controlled framework is the foundation for assessing the control risks associated
with managing and delivering online advertising, as well as reporting accurate, complete and reliable data.
Slides 37-42 step through the online advertising process, beginning with the sales insertion order and ending with
campaign reporting. Each of these slides describes the expected controls involved to successfully complete that
component of the process. Adjacent to the expected controls content box is a content box highlighting what was
actually observed through inquiry with the eleven participants, including how many participants did not have the
requisite controls in place to effectively manage that component of the process.
The resulting gap analysis from this benchmarking exercise provides the evidence supporting our findings and
proposed recommendations for achieving reliable and comparable reporting metrics.


A Simplified Overview of the Online Advertising Process


Sales: Agreement on advertiser campaign requirements and sales terms in the Insertion Order.
Order Processing: Capture of Insertion Order detail in the delivery system.
Trafficking: Establishment of the creative in the delivery system.
Delivery: Delivery and recording of ad and traffic activity by the delivery system (the area most critical to traffic
metrics).
Data Aggregation: Collection and aggregation of delivery and traffic data from the delivery system logs into the
reporting system.
Reporting: Summarization and presentation of delivery or traffic data to external parties (advertisers).



Detailed Steps Involved in the Reporting Process


1) Capture: Ad and/or traffic activity is captured in the server log or in real time.
2) Collection: Transaction data is collected from the servers.
3) Formatting: Logs are reformatted and/or sorted without altering values.
4) Filtering: Logs are filtered to exclude invalid entries, such as robotic activity, internal IP addresses and
automated page refreshes.
5) Summarization: Data is summarized and metric results are calculated following company-established
definitions (see Appendix).
6) Extrapolation: If data sampling is used for audience measurement, results are extrapolated to the entire
population of data.
7) Compilation: Metric data from separate systems is combined to determine the total metric.
8) Presentation: Metric data is presented to the appropriate parties.
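The steps can be read as a small data pipeline. A hedged sketch follows (not any participant's system); record formats and the sampling fraction are illustrative, and capture (1) and presentation (8) are omitted.

```python
# Hedged sketch (not from the study): steps (2) through (7) above expressed as
# small functions over simplified dictionary records.
def collect(server_logs: list[list[dict]]) -> list[dict]:
    """(2) Collection: merge records gathered from all servers."""
    return [rec for log in server_logs for rec in log]

def format_records(records: list[dict]) -> list[dict]:
    """(3) Formatting: reorder records without altering values."""
    return sorted(records, key=lambda r: r["ts"])

def filter_invalid(records: list[dict]) -> list[dict]:
    """(4) Filtering: drop robotic, internal and auto-refresh entries (flags precomputed)."""
    return [r for r in records
            if not (r.get("robot") or r.get("internal") or r.get("auto_refresh"))]

def summarize(records: list[dict]) -> dict[str, int]:
    """(5) Summarization: count transactions per campaign per the company definition."""
    totals: dict[str, int] = {}
    for r in records:
        totals[r["campaign"]] = totals.get(r["campaign"], 0) + 1
    return totals

def extrapolate(sample_totals: dict[str, int], sample_fraction: float) -> dict[str, int]:
    """(6) Extrapolation: scale sampled results to the entire population."""
    return {k: round(v / sample_fraction) for k, v in sample_totals.items()}

def compile_systems(per_system: list[dict[str, int]]) -> dict[str, int]:
    """(7) Compilation: combine metric data from separate systems."""
    combined: dict[str, int] = {}
    for totals in per_system:
        for k, v in totals.items():
            combined[k] = combined.get(k, 0) + v
    return combined
```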



Poorly Designed Controls Have Serious Consequences


Process Category / Standard Criteria / Key Risks and Consequences

Sales -- Do I/Os accurately reflect advertiser specifications? Risks: manual data entry errors; inaccurate sales
records.
Order Processing -- Are ad campaigns loaded as specified in the I/O? Risks: invalid insertion orders being
processed.
Trafficking -- Are campaigns trafficked to the appropriate server? Risks: wrong ad creative loaded for delivery;
improper ad modifications entered in the system.
Ad Delivery -- Are ads being served on schedule to meet campaign commitments? Risks: ad deliveries not
recorded in the proper period.
Data Aggregation -- Are all ad transactions completely and accurately written to the logs? Risks: log files for ads
delivered not reconciling with ads reported.
Campaign Reporting -- Are logs from all servers consolidated for report generation? Risks: over-reporting of ad
transactions.

Key Control Objectives -- Completeness, Accuracy, Validity & Restricted Access


Sales Process Cumbersome Due to Lack of Standard Insertion Order


Expected Controls

VALIDITY
A formal, written, consistent I/O is used for all ad sales.
Sales are recorded only after a valid sales order or contract has been properly authorized.
Changes made to existing sales contracts are properly authorized by the customer and reviewed by appropriate
company personnel.

COMPLETENESS & ACCURACY
Controls exist to ensure that valid sales are recorded once and only once.
Appropriate personnel review Insertion Orders for accuracy after they are established in the system.

RESTRICTED ACCESS
Access to Insertion Order data is restricted to appropriate users.
Sales personnel do not have the ability to execute campaigns in the delivery system.

Observed Weaknesses

VALIDITY
Format and content of the Insertion Order vary by participant.
Initial Insertion Orders are typically signed by the advertiser before the campaign is established, but revision
Insertion Orders are often not signed before processing.

COMPLETENESS & ACCURACY
Revision Insertion Orders are not always linked to the original Insertion Order.

RESTRICTED ACCESS
Security controls vary by participant.



Order Processing Controls Weaker for Revised Insertion Orders


Expected Controls

VALIDITY
Approval of the detailed Insertion Order should be obtained from both the client and internal management.
A clear policy and enforcement mechanism should exist for managing Insertion Order revisions.

COMPLETENESS & ACCURACY
Sales should create or review the Insertion Order directly in the delivery system to avoid errors due to manual data
entry.
The order system should perform validation on data inputs, including dates, guaranteed ad impressions, etc., to
prevent errors (see the sketch following this list).
A standard Insertion Order should be used for all ad sales.

RESTRICTED ACCESS
Access to input data should be restricted to appropriate individuals.

Observed Weaknesses

VALIDITY
Insertion Order format and content vary by participant.
The controls surrounding Insertion Order revisions are universally weaker than the controls surrounding initial
Insertion Orders.

COMPLETENESS & ACCURACY
Many of the participants manually enter the Insertion Order data into the delivery system from the Insertion Order.
Many of the participants' order processing systems do not perform validations on key fields.

RESTRICTED ACCESS
Security controls vary by participant.
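As an illustration of the input validation called for above (a hedged sketch with invented field names), an order processing system might check dates and guaranteed impressions before an order is loaded into the delivery system:

```python
# Hedged sketch (not from the study): basic insertion order field validation
# prior to loading the order into the delivery system.
from dataclasses import dataclass
from datetime import date

@dataclass
class InsertionOrder:
    campaign_id: str
    start: date
    end: date
    guaranteed_impressions: int
    cpm: float

def validate(order: InsertionOrder) -> list[str]:
    """Return a list of validation errors; an empty list means the order may be loaded."""
    errors = []
    if order.end < order.start:
        errors.append("end date precedes start date")
    if order.guaranteed_impressions <= 0:
        errors.append("guaranteed impressions must be positive")
    if order.cpm < 0:
        errors.append("CPM cannot be negative")
    if not order.campaign_id.strip():
        errors.append("missing campaign identifier")
    return errors

io = InsertionOrder("FALL-01", date(2001, 10, 1), date(2001, 9, 1), 500000, 12.50)
print(validate(io))   # ['end date precedes start date']
```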



Management of General System Risks Often Informal


Expected Controls

SYSTEM MAINTENANCE
System changes are appropriately approved.
System changes are adequately tested.
Access to production is restricted to appropriate individuals.

SYSTEM SECURITY
Physical access is appropriately restricted.
Logical access is appropriately restricted.

SYSTEM OPERATIONS
A disaster recovery plan exists.
Data is backed up on a regular basis.
System performance/availability is monitored.

SYSTEM DEVELOPMENT
Projects are authorized and tracked.
Requirements are appropriately defined.
Controls are considered during development.

Observed Weaknesses

SYSTEM MAINTENANCE
Change management policies and procedures are often informal.

SYSTEM SECURITY
Strength and nature of controls vary significantly by participant and by system.

SYSTEM OPERATIONS
Many participants do not have a functional disaster recovery plan.

SYSTEM DEVELOPMENT
Control requirements are often left out of the development process.



Security Controls Vary by Participant


Expected Controls

VALIDITY
All log-generating servers are configured consistently, ensuring that proper naming conventions and logging are
implemented for all servers.

COMPLETENESS & ACCURACY
Cache busting techniques are employed to ensure the capture of all user activity.
Controls exist to ensure the delivery system logs activity once and only once.
Controls exist to ensure that the delivery system logs activity in the proper period.
Campaigns are paced to achieve targets.

RESTRICTED ACCESS
Access to system resources and data is restricted using user groups and passwords.

Observed Weaknesses

VALIDITY
New servers are sometimes deployed and configured inconsistently.

COMPLETENESS & ACCURACY
Participants do not all support cache busting techniques.

RESTRICTED ACCESS
Security controls vary by participant.



Data Aggregation Exclusions Not Consistent


Expected Controls

VALIDITY
Error codes are excluded from metrics.
Non-user initiated activity (e.g., robots, spiders) is excluded from summarized activity.
Internal IP Addresses are excluded from aggregated data.

COMPLETENESS & ACCURACY
Automated checks identify corrupt or suspect data.
Collection controls ensure all logs are collected once and only once.
An audit trail is maintained to report the number of files aggregated and list the files that could not be processed.

RESTRICTED ACCESS
Access to logs is restricted to appropriate personnel to prevent tampering, loss or destruction.

Observed Weaknesses

VALIDITY
Robot filtering is not consistently used and involves various techniques.
Internal IP addresses are typically not filtered out of aggregated data.

COMPLETENESS & ACCURACY
Data retention policies differ by participant and may not satisfy audit or contractual requirements.
Participants are occasionally forced to restate previously reported activity due to corrupt or lost data.
Data recovery capabilities differ by participant.
Timing constraints or latency may cause reporting discrepancies.

RESTRICTED ACCESS
Security controls vary widely by participant.



Multiple Systems and Terms Increase Reporting Risk and Complexity


Expected Controls

VALIDITY
Reporting is based on the same standard metric definitions used during the sales process and included in the
Insertion Order.
Estimates used in reporting are based on standard criteria and are periodically validated for reasonableness and
accuracy.

COMPLETENESS & ACCURACY
Report functionality ensures that all data is summarized and presented for the specified period and advertiser.
Advertisers are notified when reporting data is available.

RESTRICTED ACCESS
Advertisers have direct and timely access to their data, and only to their data.

Observed Weaknesses

VALIDITY
Participants do not always use the terms and definitions included in the Insertion Order for reporting purposes.
Estimates used in reporting vary and are not consistently validated for reasonableness and accuracy.

COMPLETENESS & ACCURACY
Participants generally report the data only for the specified period, with some exceptions.
Advertisers are generally notified when reporting data is available.

RESTRICTED ACCESS
Most participants allow advertisers direct access to reports via a web interface; however, some participants
continue to compile and distribute reports manually from single or multiple data sources.


Proposed Recommendations:
Metrics & Definitions
Online Ad Measurement Study


Proposed Recommendations for Ad Delivery & Audience Measurement Metrics

Adopt standard definitions for the following metrics (include specific exclusions/inclusions and estimation criteria
in these standard definitions):

Primary

1. ad impression -- Consider a definition that includes best practice filtering and logging attributes. Proposed
definition: A measurement of responses from an ad delivery system to an ad request from the user
browser, which is filtered from robotic activity and is recorded at a point as close as possible to the actual
viewing of the creative material by the user browser.
2. click (through) -- Consider a definition that includes best practice filtering and logging attributes. Proposed
definition: A measurement of the user-initiated action of clicking on an ad element, causing a re-direct to
another web location. Tracked and reported as a 302 redirect at the ad server. This measurement is filtered
for robotic activity and is recorded at a point as close as possible to the actual viewing of the destination web
location by the user browser.
3. total visits -- Resolve whether the approaches to determining visitor counts can be addressed in one
definition (i.e., cookies, user registration) and require disclosure of the definition. Resolve whether session
time limits should also be included in the definition(s).
4. unique visitor -- After resolving the two issues related to the visitor definition, consider the additional issues
for defining unique visitors, including the use of sampling and estimates, and the treatment (i.e., include or
exclude visitors that do not accept cookies) of new cookies for cookie-based calculations.
5. page impressions -- Consider a definition that includes best practice filtering and logging attributes.
Proposed definition: A measurement of responses from a web server to a page request from the user
browser, which is filtered from robotic activity and error codes, and is recorded at a point as close as
possible to the actual viewing of the page by the user browser.



Proposed Recommendations for Ad Delivery & Audience Measurement Metrics (continued)

Primary
Disclose the recording technique used for all ad delivery and audience measurement metrics using the
agreed-upon industry definitions. Include suggested items to be disclosed, such as time definitions, the type of
metric used, what is excluded, and when processes are changed.
Use a different term for estimated reach calculations that use a sample of user activity drawn from the entire
population.

Secondary
Establish definitive periods for reporting (e.g., day, week, month, four-week period).
Consider adopting standard definitions for additional formats (e.g., email, rich media).


Recommendations: Processes & Controls


Online Ad Measurement Study


Proposed Recommendations for Processes and Controls

Primary
Develop a standard for cache busting practices, including the responsibilities of the respective parties in a 3rd
party serving arrangement.
Standardize exclusions such as filtering for robots/spiders and automated page refreshes.
Require inclusion of general system technology controls (i.e., security, data backup/retention procedures, change
control practices, etc.) and other process controls.

Secondary
Encourage the use of independent third parties to perform periodic reviews of compliance with the underlying
processes used to generate reporting information.
Adopt a standard industry insertion order, including format, content, and standard terms & conditions.
Establish a clear policy and process for limiting and managing revision insertion orders.


Appendix
Online Ad Measurement Study

Appendix: Project Team


The study team included professionals specializing in the online advertising industry; study advisors and
contributors were drawn from a variety of organizations.

Study Team (PricewaterhouseCoopers)
Engagement Partners: Tom Hyland, Russ Sapienza
Project Directors: Suzanne Faulkner, Pete Petrusky, Troy Skabelund, Matt McKittrick
Team Members: Michael Hulet, Chad Fisher, Jee Cho, Justin Wright, Nic Pacholski, Brianna Sorenson, Ying Li

Study Advisors / Contributors
Interactive Advertising Bureau: Greg Stuart
Media Rating Council: George Ivie
Advertising Research Foundation: Jim Spaeth
Robin Webster


Appendix: Organizational Profiles


This study was conducted independently by PricewaterhouseCoopers on behalf of the IAB.

PricewaterhouseCoopers (www.pwcglobal.com), the world's largest professional services organization, helps its
clients build value, manage risk and improve their performance. Drawing on the talents of more than 150,000
people in 150 countries, PricewaterhouseCoopers provides a full range of business advisory services to leading
global, national and local companies and to public institutions.
The PricewaterhouseCoopers New Media Group was the first practice of its kind at a Big Five firm. Currently
located in New York, Los Angeles, Boston, Seattle and the Bay Area, our New Media Group includes accounting,
tax and consulting professionals who have broad and deep experience in the three areas that converge to form
New Media: advanced telecommunications, enabling software, and content development/distribution.

Services include:
management consulting
M&A assistance
business assurance services
tax planning and compliance
Web advertising delivery auditing
capital structuring
privacy auditing and consultation
employee benefits and executive compensation packages


Founded in 1996, the Interactive Advertising Bureau (IAB)


is the leading online global advertising industry trade
association with over 300 active member companies in the
United States alone.
IAB activities include evaluating and recommending
standards and practices, fielding research to document the
effectiveness of the online medium and educating the
advertising industry about the use of online and digital
advertising.
Current membership includes companies that are actively
engaged in the sales of Internet advertising (publishers),
with associate membership including companies that
support advertising, interactive advertising agencies,
measurement companies, research suppliers, technology
providers, traffic companies and other organizations from
related industries.
The IAB is an expanding global organization with certified
International chapters and Corporate members in Asia and
the Far East, North and South America, Europe and
Eastern Europe, South Africa and The Caribbean.

Appendix: Participating Organizations


The following companies participated in the study, selected by the IAB:

Destination Sites: CNET, Forbes.com, New York Times Digital, Walt Disney Internet Group
Portals: America Online, Lycos, MSN, Yahoo!
3rd Party Ad Servers and/or Networks: Avenue A, DoubleClick, Engage

