Audience Relations at SBS:

The Balance of Power


Submission by Tim Bennett for
Honours in Media and Communications (UNSW)
November, 2004

Word count: 14800

I hereby declare that this submission is my own work and, to the best of my knowledge, it contains no
material previously published or written by another person, nor material which to a substantial extent has
been accepted for the award of any other degree or diploma at UNSW or any other educational
institution, except where due acknowledgement is made in the thesis. Any contribution made to the
research by others, with whom I have worked at UNSW or elsewhere, is explicitly acknowledged in the
thesis.

I also declare that the intellectual content of this thesis is the product of my own work, even though I may
have received assistance from others on style, presentation and linguistic expression.

Signed: Date: 4 November 2004


Tim Bennett

Abstract

The Special Broadcasting Service not only produces programs, but also constantly produces and
makes use of traces of its own audiences. Its many feedback channels are geared towards
generating information that serves two purposes. Firstly, feedback empowers SBS to refine its
production practices in the never-ending task of fulfilling its core goals. Secondly, audience
feedback allows SBS to ascertain its success in achieving those goals; it provides SBS with
measures for judging performance. The politics of control are at work in these audience relations.
Producers attempt to discern how (or if) their control over audiences functions. Audiences in turn
exert influence over how producers approach their tasks. This relationship is in constant flux, for
contemporary feedback techniques ensure that the nature of the audience (as SBS knows it) is
heterogeneous and fluid.

Contents

Introduction
SBS Television
SBS Digital Media
Public Relations
Conclusions
References
Appendix

Chapter One: Introduction

Staff at SBS, like staff at any broadcaster, unavoidably make most of their production decisions
away from the immediate presence of the radio listeners, television viewers and online users who
constitute their audiences. How does SBS, from its base in Artarmon, come to know
about those audiences? What purposes do audiences serve within SBS? For that matter, why does
SBS bother with its audiences at all? Audiences are troublesome: getting to know them is difficult
and expensive; it sometimes seems that they can be quantified in any way one sees fit. What they
do say is sometimes unpleasant and counter-intuitive. One might imagine that broadcasters would
be better off without the audience.

But broadcasters truly need audiences in order to survive. It is one of the key tasks of a
broadcaster to attempt to know its audience. A range of feedback mechanisms inform the
broadcaster's knowledge of its audience. With multiple techniques of gathering feedback, there
are thus many images of the audience. It is accurate, in fact, to say that SBS has many audiences.
Each audience affects production processes in manifold ways. While knowledge of audiences is
always abstracted - it is an idea, or a re-imagination of feedback - this knowledge is often put to
practical use.

SBS pursues a wide range of institutional goals through feedback-informed knowledge and
actions. These goals are complex, made so by SBS's dual position as a
commercial and public broadcaster. In particular SBS's responsibility to serve a multicultural
Australia necessitates constructions of an audience which include references to both diversity and
a united whole. The many audiences of SBS are increasingly recognised to be enmeshed with one
another, as demonstrated by the recognition that television and online audiences share many of
the same citizens.

Staff at SBS are well aware of the complexity (and often inadequacy) of audience feedback.
Nevertheless there are many practices in place which seek to achieve the goals of SBS by utilising
it. In this thesis I have documented some of these practices and how they contribute to the
effective functioning of the broadcaster as a whole. This thesis demonstrates the audience's key
role in broadcasting and its agency in institutional decision-making.

About SBS

The Special Broadcasting Service (SBS) is a radio and free-to-air television broadcaster, and more
recently an online content provider, primarily based in Sydney, Australia. It has existed in one
form or another for over twenty-five years. Initially a radio station operating only in Sydney and
Melbourne, SBS now broadcasts to most of Australia through its radio and television services. Its
online content is accessible throughout the World Wide Web.

'The principal function of SBS is to provide multilingual and multicultural radio and television services
that inform, educate and entertain all Australians, and, in doing so, reflect Australia's multicultural
society.' (Special Broadcasting Service Act 1991: Section 6)

SBS became a corporation through the Special Broadcasting Service Act of 1991. The Act sets out
the particulars of SBS's operations and the general spirit of its purpose. Section 6 of the Act sets
out the corporation's Charter, a document of surpassing institutional importance. (A copy is
provided in the Appendix.) SBS's programming operations must revolve around this Charter's
conditions, or face charges of incompetence and failing to serve the audience.

The corporation's funding comes primarily from the Australian Federal Government; it is
generally referred to as a public broadcaster, in the sense of the public service tradition of the BBC
and the Australian Broadcasting Corporation (ABC). Nevertheless its funding model also allows it
to collect revenues from commercial activity. This activity is primarily the sale of airtime and
webspace to advertisers. This hybrid method of funding is practically unique within Australian
broadcasting. It also burdens SBS with two sets of masters to whom performance must be
justified.

The type of content broadcast by SBS is another unique feature of the organisation. SBS transmits
roughly half of its television programming in languages other than English. SBS Radio features 68
languages each week. Online, SBS Radio websites are available both in-language and in English. It
is this multi-ethnic approach to broadcasting that often sees SBS branded an 'ethnic' broadcaster.
SBS prefers to label its approach 'multicultural'. The difference is one of potential audience:
multicultural broadcasting aims to address issues within the entire community, not just an ethnic
segment. Multicultural broadcasting aims to be inclusive.

This thesis demonstrates how the idea (and more practically, the construction) of audiences is
central to SBS's institutional practices. In a practical sense, audiences are unknowable and
imaginary, inferred from an institutional perspective only by the existence of audience feedback
(Ang 1991: 3-5). Constructions of audiences, assembled from the myriad forms of audience
feedback available, are the closest that SBS can come to knowing 'real' audiencehood. These
'virtual audiences' are:

'abstraction[s] constructed from the vantage point of the institutions, in the interest of the institutions.'
(Ang 1991:2)

In other words, audiences are constructed to be used institutionally. SBS, in seeking to achieve its
operational goals, relies heavily on utilising its constructed audiences. This thesis examines some
of those operational goals, as well as methods of collecting feedback and constructing audiences.
The practice of applying audiences in achieving goals is shown to be central to SBS's ongoing
activities.

Beginning with a fundamental examination of SBS's commercial and public service
responsibilities, this thesis then outlines the relationships of control evident between producers
and audiences. Following this are three major chapters outlining distinct areas within SBS. In each
chapter I have examined multiple cases in which audiences have had a noticeable effect on
institutional practices. Chapter Two explores how SBS-TV uses the audience for both
commercial and non-commercial activities. Chapter Three demonstrates three different methods
of considering SBS's online audience. Chapter Four examines the relationship between audience
knowledge and corporate public relations. In the conclusion I have outlined three broad groups of
practices that are each affected by audience feedback. Briefly these are evaluation, planning and
public relations. Practically all of SBS's audience-related processes fall within these three groups,
with some audiences helping to serve multiple tasks.

SBS's Commercial and Public Service Motivations

The two driving influences behind SBS are the public service and commercial broadcasting
traditions. While based on fundamentally different goals, these are certainly not incompatible
traditions. Public service broadcasters, in the fashion pioneered by the BBC, speak to an audience
of citizens in a manner calculated to have a certain social or cultural effect. Commercial
broadcasters treat the audience as a resource which can be utilised to generate advertising
revenue. It is acceptable to broaden the (usually government-supplied) funding base of a public
broadcaster by selling advertising space. Public broadcasters do come under fire when they are
perceived to be putting commercial concerns ahead of public service concerns. This is a fine line
that SBS is occasionally accused of transgressing. While marketing a program's audience is
accepted, the creation of a program driven by marketing is not.

'The relationship of public service institution to its audience remains essentially characterized, not by
economic profit-seeking, but by a pervasive sense of cultural responsibility and social accountability.' (Ang
1991:28)

Public service broadcasting may be thought of as 'giving the audience what it needs' (as opposed
to 'what it wants'). In SBS's case 'what the audience needs' was principally defined by the
Australian Government, and written into the operational guidelines known as SBS's Charter. The
Charter dictates the fundamental ways in which SBS is to address its audiences. It is, therefore, at
the heart of SBS's inquiries into its audiences. SBS programming must be assessed against the
Charter, both in planning and execution.

Whether or not Charter requirements are being met is based on a complex interpretation of the
category of 'audience'. SBS's management is expert in demonstrating how individual programs fit
the goals expressed in the Charter. One example is the defence thrown up around the 2004
program Desperately Seeking Sheila. In this program several men living in rural Western Australia
are introduced to prospective spouses, many of whom hail from the United Kingdom. The show
generated considerable controversy in late 2003 when it was first announced:

‘Before a bride-to-be has been recruited, there have been murmurs of 'dating show' and - worse - 'reality
television'. This, Canberra insiders have said, could be a betrayal of the network's charter. Liberal senator
Santo Santoro has warned that SBS is offering increasingly similar fare to the ABC and commercial
networks.’ (The Age, November 9 2003)

When questioned in a Senate Estimates committee hearing, managing director of SBS Nigel
Milan gave this defence:

'Basically, the program is based around the social problems of young farmers in remote parts of Western
Australia finding it difficult to find brides to go out and live in that part of Australia with them.'
(Hansard 2004-02-16: 79)

Head of Television Shaun Brown also shed light on how SBS seeks to meet its obligations under
the Charter:

'Looking at the charter, not every program is going to get a tick in every box – and frankly I think it
would be impossible to do so.' (Hansard 2004-02-16: 79)

Much of the discussion surrounding these two statements seeks to quantify the inclusion of
non-Anglo-Celtic people as participants in the show. This demonstrates an underlying assumption that
SBS is primarily a network for non-English-speaking migrants and their descendants in the
Australian community. SBS management strenuously argues against this view, pointing out that
the network is not a 'ghetto' broadcaster. The idea of a multicultural potential audience, as laid
out by the Charter, shapes how programs are formulated and produced at SBS. The flip side to
this is that 'multicultural' is often held to be synonymous with 'non-Anglo'.

SBS is not only interested in its audiences for public service reasons, but also for commercial ones.
As discussed further in Chapter Two, SBS is limited in the amount of advertising it can carry. It is
also limited in how much it can pursue large audiences. SBS is predominantly seen as a niche
broadcaster, and its Charter goals are defined in such a way that large audiences (which attract
higher advertising revenues) are irrelevant to performance. Nowhere does the Charter say that
SBS programming should be popular. SBS is not supposed to be interested in mass audiences.
While nothing forbids it from achieving large audiences, there is a tacit belief that large audiences
are achieved by pursuing commercial success. As Managing Director Nigel Milan said in
November 2003:

'If popular appeal was the yardstick of successful television, then SBS would have to be a clone of Channel
9.' (Milan 2003)

SBS avoids 'chasing ratings' for commercial purposes. This puts it at odds with one half of Ang's
definition of commercial television:

'Commercial television can be characterized at several levels, but in its barest form it is based upon the
intertwined double principle of the making of programmes for profit and the use of television channels for
advertising.' (Ang 1991:26)

Thus the type of commercialism practised by SBS – and especially SBS-TV – is not that of a for-
profit broadcaster such as the Nine network. Programmes are not made exclusively for profit, but
the channel does carry advertising. SBS does not focus on maximising its audiences, but it
certainly uses its audiences as a resource.

A notable exception in terms of audience maximisation is in Digital Media. In that department
larger numbers are always better. SBS aims to achieve month-on-month growth in audience
figures:

'Monthly, we will achieve minimum increases in traffic of 10% throughout 2005.' (Digital Media
Campaign Eight, see Appendix)

This tends to attract little notice, especially since very little advertising is present on websites –
therefore Digital Media's motivation is not thought to be commercial. Digital Media is also a
relatively new part of SBS's operations, still trying to justify the money invested in online
production. Growth in audiences is therefore correlated with success.
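It is worth noting how steep a 'minimum 10% monthly' target is once compounding is taken into account. The following sketch uses an invented starting figure (100,000 monthly visits, not an actual SBS statistic) purely to illustrate the arithmetic:

```python
# Illustrative only: the compound effect of a 10% month-on-month traffic target.
# The starting figure of 100,000 monthly visits is a hypothetical placeholder,
# not a real SBS traffic number.

def project_traffic(start, monthly_growth, months):
    """Return projected traffic after compounding growth each month."""
    traffic = start
    for _ in range(months):
        traffic *= (1 + monthly_growth)
    return traffic

start = 100_000
end_of_year = project_traffic(start, 0.10, 12)
print(round(end_of_year))  # 313843 -- more than triple the starting figure
```

In other words, a "minimum 10% monthly" increase implies better than a threefold rise over a year, which helps explain why audience growth could stand in so readily for success in a department still justifying its budget.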

It seems that the public-service values of SBS can remain compatible with its commercial values
only if the pursuit of audience enlargement is limited. While there is no physical law which
dictates that entertaining, educational and informative multicultural programming cannot achieve
large audiences, the available evidence suggests that this is indeed the case. Staff at SBS
also realise that the broadcaster operates in a larger media environment dominated by television
stations which routinely achieve more than five times SBS's average audience.

The Operation of Control in Audience Relations

'Quite simply, the people who want to know about audiences, want to know information about them
which can be used to control them, and make their behaviour more predictable.' (Nightingale 1985: 8)

Audience relations are characterised by constant struggles for control. SBS requires audiences; its
Charter goals are predicated on dealing with a certain type of audience, an audience which
needn’t necessarily watch SBS. Therefore SBS must constantly attempt to maintain and grow its
audiences. The institutional discourse around audiences typically focuses on attempting to control
them. Audience measurement's primary goal is to ascertain the effectiveness of SBS's control over
its audiences. Naturally this is not manipulative control. It is, as Ang states, 'of a discursive rather
than a material nature' (Ang 1990: 8). There is no method of programming which forces people to
take part in being an audience.

However, producers assume that they possess control in some form - usually persuasion or
attraction of the audience. Certain programming moves often appear to correlate with certain
changes in audience feedback; this equates in the minds of producers with 'control over audience
behaviour'. The primary use of audience feedback is to aid producers in controlling the audience.
Producers are empowered by audience feedback to attempt control of an audience. This is only
half of the picture. The other half is that producers frequently respond to audience feedback. This
demonstrates that audiences also have an element of control over producers, albeit one that is
generally unconscious and limited. One might say that there is no method of feedback which
forces producers to take notice of the audience, but that it does happen.

So audience feedback is itself a form of control over program makers. Specifically, audience
relations can be discussed within the framework of Deleuzean societies of control. It can be seen
that control between SBS and its audiences does not rest in any one place, and does not cease
movement. Producers are always in the act of trying to assert control over the audience through
influencing audience behaviour. Similarly, audiences are constantly provoking reactions within
producers. Control is never complete or unidirectional. Deleuze points out that control is not
exercised through 'molds' but through 'modulations' (Deleuze 1990). Producers are constantly
modulated by the feedback coming to them from audiences. The effects of this process are seen in
the remodulation of SBS content (and by extension, SBS audiences) by producers who seek to
address a certain audience. 'Limitless postponement' (Deleuze 1990) is another feature of this
relationship, in that the task of audience management is never complete.

Examining audience relations in this light suggests a possible reason for another shift in public
broadcasting. Ien Ang, in 'Desperately Seeking the Audience', comprehensively details the growing
reliance on empirical methods of feedback by public broadcasters (Ang 1991:140-152). She
contrasts empirical feedback with the 'normative aspects involved in the very idea of public service
broadcasting' (Ang 1991:148). That is, a reliance on empirical measurement seems at odds with
the paternalistic tradition of public broadcasting in which the broadcaster 'aimed to change, not
anticipate, audience taste' (Ang 1991:141). The social project of SBS would, under such a system,
take precedence over the audience's desires.

I suggest that empirical measurement has gained the upper hand at SBS because producers feel it
is of greater assistance in controlling the audience. Audience measurement allows SBS to
participate with audiences in the economy of control described above. Empirical measurement,
which is generally held to be scientific and objective, allows producers to constantly tweak their
methods of control. Thus the distance between broadcaster and audience is diminished, and the
cycle of production and response quickened. Producers feel that the audience is better served by a
responsive broadcaster, a broadcaster that is in touch with its audience. For producers, empirical
measurement seems like the ideal way to exercise control over the audience.

Chapter Two : SBS Television

The Television arm of SBS is the organisation's most publicly visible aspect, available to
Australians in most communities larger than a few hundred people. As the part of SBS with the
largest audience, SBS-TV has the most invested in coming to terms with its audience. That is to
say, SBS-TV has the most urgent need for valid audience constructions and narratives. These
audience constructions and uses are specific to SBS's distinctive needs as an organisation.
However, on some levels these methods operate in the same fashion as those deployed by
broadcasters with quite different goals. Examining how SBS-TV constructs and employs audiences
reveals much about how it functions as a broadcaster.

In the initial stages of audience feedback collection, SBS-TV operates in much the same way as
the other television stations of Australia. This is hardly surprising, given that the Australian
networks rely on a single system for the production of television ratings data. It is the uses of this
data by SBS that produce illuminating insights into the broadcaster's difference from other
Australian networks. The staff in SBS's television production, publicity and marketing
departments are able to accomplish many of their core tasks by using ratings in conjunction with
other surveys of SBS television audiences and the Australian public in general.

Television ratings documents are interesting products in their own right. I have defined television
ratings as traces of audience activity, which are generalised from a survey panel to the broader
population. Within Australia, survey panels are maintained by ATR Australia, the Australian arm
of the AGB Italia Group. The AGB Italia Group installed the first national audience survey panel
in Italy, in 1981 (ATR Australia, 2004). Television Audience Measurement (TAM) data, or
ratings, is collected by ATR on contract for OzTAM, a company which presents ATR's data to
Australia's broadcasters.

The survey panel presented by OzTAM consists of roughly three thousand homes distributed
through Sydney, Melbourne, Brisbane, Adelaide and Perth. According to Emelia Millward at SBS:

'SBS does not subscribe to the regional 'elemental data', which means we don't have access to the minute
by minute data like we do for the metro markets. This means we can't see how a program performed in
regional markets the next day & we can't get a 'national' viewing figure for programs. But we do get buy
[sic] a quarterly summary of some SBS viewing measures for regional markets so we can track some
trends.' (Millward 2004)

The '5 City Metro' Ratings cover roughly 64% of the Australian population – those living in the
five cities listed above (Australian Bureau of Statistics, 2003). Ratings collection possibly focuses
on these markets for commercial reasons (because city inhabitants would tend to be wealthier,
and thus more marketable) and technical reasons (because it is presumably easier to maintain a
survey panel within a concentrated area, as opposed to a distributed area). Therefore it is safe to
say that the SBS audience in regional Australia is literally invisible in the daily ratings figures
obtained by the television marketing department.

These ratings documents are electronically distributed daily to relevant persons throughout the
organisation. The daily ratings reports generally contain a top-level summary of the previous
night's best-performing programs. Often these are accompanied by comparative statistics and
theories about how particular numbers came about:

'Sunday Night

Evening share was 3.6%. Top programmes were Desperately Seeking Sheila with 326k ['k' = thousand
viewers] and World News with 218k.

Sheila achieved the second highest audience for the series. It benefited from Channel 10 scheduling
[Australian] Idol an hour earlier due to the ARIA broadcast at 19:30.

Note: All audience figures are for mainland capital cities only.' (Daily ratings, 2004-10-18)

Attached to each email is a spreadsheet with the ratings figures for the previous night's prime-time
ratings period (6.30pm - 10.30pm). These daily ratings emails serve two functions. Firstly they are
a courtesy to individuals who might be concerned with how a particular program is performing.
Secondly the reports can provide emotive 'spin' about certain programs – this interpretation of
ratings is generally, though not always, positive in nature.

Ratings are 'traces of audience activity' because they cannot conclusively be proven to correlate
perfectly with viewing behaviour. The audience activity in question is an interaction by survey
panel members with the ATR 'Peoplemeter', a device which 'records and stores four pieces of
information: time, TV set on/off, channel tuned [and] persons viewing.' (ATR Australia, 2004) A
wealth of academic research calls into question the relationship between the 'persons viewing'
figure and who is actually watching (Ang 1990: 154). Additionally, the quality of viewing remains
unknown in these four pieces of information: does the audience have a high level of engagement
with the program? Or is the program only playing in the background while other activities take
place?

These concerns are rendered somewhat irrelevant by the simple fact that, with rare exceptions,
TAM data is accepted as having 'face validity'. Simply put, 'face validity' is the 'judgement that a
measured variable really measures the phenomenon it represents' (Gunter in Bruhn Jensen, 2002:
212). ATR cites its mission as being:

'To establish a common currency used by TV Stations, Media Planners and Advertisers for their
advertising transactions, based upon a reliable, independent, and transparent audience measurement
system.' (ATR Australia, 2004)

Australia's TV stations, media planners and advertisers largely agree that the ATR data provides a
lingua franca for discussing the performance of media properties – and that is good enough to
allow ATR's figures (as provided by OzTAM) to be put to use within SBS. Nevertheless, the
organisation feels that OzTAM data often serves SBS quite poorly. One of the main reasons given
is that SBS typically rates more highly with non-Anglo Australians, who SBS feels may be under-
represented on the ratings panel. Pat Quirke-Parry, Head of Sales, explained:

'When the random sample is being constructed - interviewers knock at doors and they are all only
required to speak in English - so if they come across a home where English is not the first language and
communication is difficult or impossible they will simply select the next random household. Hence our
audience is almost bound to be significantly underrepresented in what is otherwise a perfectly correct
random sample.' (Quirke-Parry, 2004)

Another reason is that SBS, with a maximum audience at any one time of around five percent of
all TV viewers, relies on a smaller number of panel members for its measurements than do the
other stations. Having fewer panel members decreases the statistical accuracy of any
measurements performed on TAM data.
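The sampling-error point can be sketched with the standard error of a proportion. This is a deliberate simplification (real TAM panels use weighting and clustering, and these figures are illustrative rather than OzTAM's), but it shows why a small share is proportionally noisier:

```python
import math

def share_margin_of_error(share, panel_size, z=1.96):
    """Approximate 95% margin of error for an audience share estimated
    from a simple random panel (ignores weighting and clustering)."""
    se = math.sqrt(share * (1 - share) / panel_size)
    return z * se

# Illustrative: a 5% share (roughly SBS's peak) measured on a 3,000-home panel...
moe_small = share_margin_of_error(0.05, 3_000)
# ...versus a 25% share for a larger network on the same panel.
moe_big = share_margin_of_error(0.25, 3_000)

print(f"5% share:  +/- {moe_small:.2%}")  # about 0.8 percentage points
print(f"25% share: +/- {moe_big:.2%}")    # about 1.5 percentage points
```

Although the absolute margin is smaller for the low share, relative to a 5% share it is nearly twice as large in proportional terms, so SBS's night-to-night figures fluctuate more, relative to its audience, than those of the bigger networks.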

The ATR quote above intimates that TAM data is generally used for setting the price of buying
advertising space on a commercial network. SBS uses the data in this sense, but it also uses the
data for non-commercial purposes, often in quite creative ways. I will first examine the
commercial, and then the non-commercial uses of television feedback.

The Commercial Aspect

'In 2002, SBS Television was watched by more than 7.7 million Australian each week.' – (SBS 2003:
21)

SBS is permitted to run five minutes of advertising per hour of programming. Any criticism of this
arrangement is usually countered by SBS with the assertion that advertising revenue allows SBS to fund a
wide variety of local content production. In any case, SBS's advertising is hardly intrusive, running
in blocks between programs and not interrupting the flow of programming.

Audience traces have significant value to the organisation. TV Marketing uses its ratings data to
set the prices of SBS's 30-second advertising blocks. A rate card summarises the price for a block
at any time of the day. Blocks can be purchased in six different markets: national (total coverage),
Sydney, Melbourne, Queensland, South Australia and Perth. The reader will notice that these
areas overlap with the five cities from which SBS receives daily ratings, suggesting the use of
ratings in setting these prices.

Part of the Marketing department's repertoire is an industry website called 'SBSin'
(http://www.sbs.com.au/SBSin/), not to be confused with the commissioning arm 'SBS
Independent', which is often referred to as 'SBSi'. The site serves as publicity and marketing from
SBS to other industry areas, especially advertisers. The entire site is geared towards extracting
maximum promotional value from the audiences that SBS has constructed through its research.
Sections such as 'Why Advertise on SBS?' rely heavily on talking about an audience:

'SBS viewers are well-educated and more likely to work in professional, managerial and upper white-
collar occupations. They have high discretionary spending power, especially valued by marketers in
automotive, financial services, communications, computers, travel and tourism, entertainment and leisure,
government advertising and many other major categories.' (SBSin 2004(1))

The site also provides presentations of the sort usually aimed at persuading media buyers to
advertise on SBS. Audiences feature prominently in these presentations, as advertisers are
primarily interested in who will likely be viewing their advertisements. The following examples
come from SBSin's 'SBS Television 2004' document:

'- Viewers rank SBS program quality highest of all FTA [free to air] networks' (page 8)
'- Audience Phone Feedback: Entertaining, Challenging, Intelligent, Worldly, Controversial, Informative
[etc.]' (page 9)
'- SBS continues to reach more than twice Pay-TV's audience' (page 18)
'- SBS viewers profile well above the population for:
- Degree 117
- Professional 110 [population average 100]' (page 20) (SBSin, 2004 (2))
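The 'profile' figures quoted above (Degree 117, Professional 110) are standard audience indices: the incidence of an attribute among viewers divided by its incidence in the general population, multiplied by 100. The incidence percentages below are invented for illustration, not SBS research data:

```python
def profile_index(audience_incidence, population_incidence):
    """Audience index: 100 = population average; 117 = 17% over-representation."""
    return round(100 * audience_incidence / population_incidence)

# Hypothetical figures: if 23.4% of viewers hold a degree versus 20% of
# the population, the index is 117.
print(profile_index(0.234, 0.20))  # 117
print(profile_index(0.22, 0.20))   # 110
```

An index above 100 is thus a claim of over-representation, which is why such figures feature so prominently in material aimed at advertisers.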

Interestingly, any doubts or questions that SBS may have about the validity of its data cannot be
seen on the website. Whatever the truth of the figures and the audiences they construct, SBS
markets them as certain and reliable. My implication here is not that the marketing department
seeks to deceive its advertising clients. Advertisers are not duped into accepting these figures. SBS
and the advertisers are able to put these audiences to use because they are agreed to be good
enough for the purpose.

This relationship is a consensual hallucination which continues because of its ability to help get
the job done. SBS earns a lot of money (which advertisers are willing to pay) through an agreed-
upon system of exchange: show us the audiences, and we'll show you the money. This is an
example of the audience constituting an empowering resource for SBS. Ratings are no guarantee
of future performance, and do not indicate whether or how much a viewer has paid attention to
an advertiser's message. Still the system continues to function, in the faith that it does have utility
for those involved in it.

Non-commercial Motivations and Uses

The commercial uses of the television audience outlined above are certainly important to SBS
Marketing. However the same audience constructions are routinely used for a variety of other
non-commercial purposes. The core business of SBS television is to populate its broadcast
spectrum with programming that helps the broadcaster to fulfil its charter obligations. The
charter, as explained in Chapter One, suggests a style and an audience for programs, but leaves the
specifics for SBS to decide.

The presence of an audience always affects the organisational practices of those working in SBS-
TV. I will examine three examples of how audiences shape the practices surrounding television
programs. The first is Monday nights on SBS-TV, where for several years programming has
revolved around maintaining a certain type of audience. The second is the 'genre audience', where
producers have so strong a gut feeling for the audience that it becomes a background
consideration. Both cases are examples of SBS's attempts to control the audience – although a
better analogy might be 'baiting the hook'.
Manipulative control of audiences is not possible, but knowing how to attract them is the next
best thing. The third example is where television ratings are used in processes which are not
commercially-motivated. Though ratings are usually intended to set the price of advertising, they
can be used as a valuable resource for program makers.

Scheduling television programs is achieved by addressing the basic questions, 'What do we put on
television, when, and why?' What, when and why are all related to considerations of the audience.
After examining SBS's Monday night programs one might respond to these questions, 'Light
entertainment and comedy, Monday nights, because it works'. Repetition is a tool which can
build up predictability in an audience.

In 1997 SBS began broadcasting the cult animated program South Park. The audience for this
program grew quickly, based around a core audience of younger male viewers. In its regular
timeslot of 8.30pm Mondays, South Park provided SBS with some of its highest ratings while it
screened. It was controversial and edgy, but still explored social issues in an entertaining fashion.
It also attracted viewers who were much younger than SBS's typical audience of that time. South
Park could be described as a surprise hit, but it had a lasting impact.

From that time onwards, Monday nights have been home to many of SBS's comedy programs.
Amongst these are Pizza, Life Support and more recently John Safran vs God. SBS found itself with
a particular audience for its Monday night programs, one which (through its large size, especially)
was attractive to advertisers. It sought to retain this audience by assigning a theme to Monday
nights: comedy and light entertainment. (This contrasts with much of SBS's documentary, current
affairs and foreign language programming.) In doing so SBS was laying out the bait to catch a
recurring audience, to 'grow the audience' as industry-speak might have it. Scheduling the same
sort of material in repetitive ways is an excellent method of encouraging patterns of behaviour in
audiences, a form of persuasive (though not manipulative) control.

Another type of predictability is found in 'format' programs. The internal structure of some
programming is so predictable that it almost seems to ignore the audience. The World News on
SBS-TV is one example. Reporting current events and presenting them in much the same way
every night is a key part of any news service. It may seem that neither of these practices considers
the audience; one might think that these tasks are audience-independent. In fact the structure of
news and current affairs programs relies heavily on audiences. It is just that those audiences are
taken for granted to such an extent that they rarely receive explicit consideration.

The World News operates with a consistent format, but is obviously different from other news
programs. This is a choice related to the idea of a multicultural audience. The World News deals
mostly with international issues, often providing global context for national stories. The weather
report at the end of the program traverses the entire globe. Sports stories are rarely featured,
unless they involve major events such as the Olympics. (SBS runs a sports program immediately
after the first evening bulletin, hence the lack of sports coverage during the news.) These are all
choices which demonstrate the producers' priorities in serving the audience.

Mark Maley is executive producer of the Insight program. Insight debates one 'hot topic' per week
in a discussion-panel format before a studio audience of people who are relevant to that topic.
Choosing the particular topic for a week comes down to a judgement about what will be of
interest to the audience:

'There is a lot of consultation within the [production] unit, but ultimately it's about relevance to the
audience. It's about finding issues that people themselves are talking about, or are themselves affected by...
As a current affairs program, we fulfil a fairly important social function. And the essence of that social
function is to analyse issues, to tell stories, that are relevant to the lives of a reasonably broad audience.'
(Maley 2004)

This is an example of the gut instinct that producers often have about audiences. Audiences are
always present in their thinking about production, even if those audiences aren't explicitly
referenced as a reason for doing something.

'It is very much my job to think, 'Is this going to be of interest to a viewer? Is this going to irritate a
viewer? Is this going to impassion a viewer, or move a viewer?' It's a subtle imaginative process because
there is no single [type of] viewer out there.' (Maley 2004)

Of course SBS's producers also use ratings, though not for commercial purposes. They use ratings
as a marker – often the primary marker – of a program's relative effectiveness in serving the
audience. This is a sensitive topic, as most, if not all, SBS producers would abhor the accusation
of being 'ratings driven'. But television is a very expensive exercise, and to be unable to
demonstrate the impact of programming would raise the question of whether public funding is
justified.

'Ratings are a smaller part of the picture, but still part of the picture. They are the only objective evidence
you have of the size of your audience, and whether you're relevant. And there's no point putting a prime-
time show out to [only] 20,000 people – that is a failed exercise.' (Maley 2004)

The use of empirical measures in this way is part of what Ien Ang describes as 'a shift away from
reliance on the a priori, normative knowledge about how the audience should be addressed, which
is part and parcel of classic public service philosophy.' (Ang 1991: 104) In other words, this is
evidence that public service broadcasting is increasingly driven by ideas of the audience as a
market, rather than a citizenry. The basic direction of SBS programming is driven by the charter,
which considers a citizen audience, but the specifics are overwhelmingly governed by market data.
The characteristics of a citizen audience change slowly, mostly as its demographics change, but a
market audience can change as fast as its measurements are collected – daily, in the case of ratings.
Producers' attempts to control audience behaviour therefore demand empirical measurement.

Chapter Three : SBS Digital Media

The room that houses most of Digital Media at SBS is tucked away at the back of the building,
near the loading dock and security office. Its lack of natural light leads staff to refer to it as
'the mushroom farm' or 'the bunker'. As physically removed as they may be from daylight and the
rest of SBS, Digital Media's staff are actually quite well-versed in relating to their audiences.

Digital Media is a part of SBS's division of Technology and Distribution. It is mostly responsible
for the design and development of websites and related technologies (for instance, uploading video
archives of shows). Another of Digital Media's tasks is the periodic evaluation of web content.

Many websites are developed as support for or extensions of television or radio programming.
Digital Media assembles sites and then trains the relevant television or radio program's staff in the
day-to-day maintenance of the site. Major site redesigns occur regularly, a process which brings
sites back into the Digital Media domain.

My experience with Digital Media was gained during the nearly nine months I worked there, from
February to mid-October of 2004. While working in the role of Audience Research Producer I
was constantly involved in the process of interrogating the online audience and reconstructing its
feedback.

In this section I will be examining three major areas in which Digital Media encounters and
relates to its online audiences. Each method constructs the audience in a distinctly different way,
foregrounding the context-specific salient qualities of that audience. Ratings harness the
descriptive qualities of a census of past activity, usually evaluating instead of trying to predict user
behaviour. User surveys focus on areas of website usage which cannot be inferred from ratings,
allowing a more user-oriented (as opposed to usage-oriented) evaluation of websites. Usability
testing sacrifices the statistical safety of a large sample for observation and 'gut feeling' – it makes
an informed extrapolation from a very small group to a very large group.

The online audience is often considered to have more control over its interactions with the
medium than the television audience does; hence a website's online audience is frequently referred
to as its users. The term reflects the non-linear format of websites and the expectation that users
will navigate (and use) a site in a variety of ways (in contrast to the linear start-to-finish format of
standard television). The major consequence of this difference is that Digital Media puts more
effort than SBS-TV into imagining the audience's ability to use, rather than consume, content.

In Digital Media the terms 'audience' and 'users' are used interchangeably, but when talking to
television and radio staff the preferred term for people using SBS websites is 'users'. This prevents
confusion between the 'viewers/audience' of a television program and the 'users' of the program's
related website. The distinction between 'audience' and 'user' highlights that producers still
regard television and online audiences as rather separate, although user surveys have demonstrated
that there is a large crossover between the two.

SBS's Digital Media department labours under the same crisis of knowledge as SBS in general.
There is a deep organisational need to construct audiences, to measure and rationalise them, and
to use those audiences for further purposes. As with SBS Television, Digital Media employs a
specialised range of techniques which directly facilitate institutional uses of the audience. At the
same time the deeply porous nature of SBS can be seen in the thorough permeation of Digital
Media's activities by the audiences Digital Media constructs. The three main ways in which the
online audience (or user) is constructed by SBS, and the different ways that these audience
constructions are used within Digital Media, are as follows.

The first method of collecting feedback is through the RedSheriff Customer Intelligence system,
which operates in a similar fashion to the research methods employed by television marketing.
Online ratings data is widely agreed to be the most objective information available; it measures all
users rather than a sample panel, thus avoiding the statistical uncertainty of extrapolation.
However it still carries its own problems when used to construct an audience.

The second feedback method is the suite of annual user surveys deployed by SBS on several of its websites.
These surveys, while aspiring to the statistical objectivity claimed for ratings data, utilise much
more qualitative and specialised purpose-based data collection techniques. Surveys also come with
a large range of drawbacks which must be dealt with or rationalised.

Finally the producers working in Digital Media are constantly aware that their creations need to
be usable. To this end, producers always consider the audience's abilities as internet users. To
extend this knowledge, Digital Media conducted a small-scale user testing project in July 2004.
This method allows a very close engagement with site users, and the ability to see how 'fresh eyes'
interpret the site. While sites in development are always tested, this was the first time that
formalised user testing had been adopted.

SBS's Technology and Distribution division assesses its performance in ten broad sectors called
'Campaigns'. Campaign Eight is entitled 'Maintaining and growing our strengths in digital media
creations'. It is the Campaign which most closely involves the use of audience feedback, and thus
the one which I will examine here. The 'Key Result Areas' (KRAs) of this campaign are:
'The Division will grow its online audience;
The Division will develop a sustained online community of loyal visitors; and
The Division will extend its content offer through interactivity.'
(SBS Technology & Distribution Campaign Eight – see Appendix)

These KRAs delineate the importance of listening to audiences and developing a relationship with
them. Digital Media must overcome the gap between audiences and producers to assess its
performance in Campaign Eight. Staff in Digital Media utilise the three audience constructions
introduced above – ratings, surveys and usability – in order to achieve this.

Of course, we did not use these constructions only for evaluative purposes – we also used them to
inform creative decisions about developing our content further. All three methods of constructing
and deploying audiences demonstrate the institutional value found in a meaningful engagement
with feedback gathered from the audience. Each method also demonstrates how audience relations
are used to judge audiences or to attempt to change them, as well as how audiences influence the
behaviour of producers.

RedSheriff

At the most quantitative end of Digital Media's feedback spectrum is the data produced through
the RedSheriff Customer Intelligence ratings collection system. The RedSheriff system is
universally regarded as a valuable resource within SBS Digital Media. It directly tackles the crisis
of knowledge at the heart of broadcaster-audience relationships, doing so in ways that reduce the
divide between users and producers. A key difference with television ratings systems is that the
RedSheriff code on SBS websites collects information from practically all users, rather than a
sample panel. The RedSheriff data therefore represents not a statistical extrapolation of audience
feedback, but a solid representation of the actual feedback quantities involved.

RedSheriff, acquired by Netratings Inc. in mid-2004, is a webpage data collection and
organisation tool. RedSheriff provides SBS with website code which allows RedSheriff to gather a
large amount of information about SBS's website users. This information is then organised into
searchable web-based reports for SBS.

'Web sites that run RedSheriff Customer Intelligence include a few lines of Java script [sic]
(instrumentation) into every Web page. When a page is loaded into a browser, the instrumentation sends
information to the front-end machines which collect the data. The data is then processed by back-end
machines and made available to the client in the form of reports.' (RedSheriff 2004)
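The tagging mechanism described in the quote above can be sketched in a few lines. This is a
hypothetical illustration only, not RedSheriff's actual code: the collection hostname
('collect.example.net') and the parameter names are invented for the purpose.

```javascript
// When a page loads, the embedded instrumentation assembles a tracking
// request (typically fetched as a tiny image in tagging systems of this era)
// whose query string carries the page details back to the collection servers.
function buildBeaconUrl(page) {
  const params = new URLSearchParams({
    site: page.site,            // which SBS site the page belongs to
    path: page.path,            // the page being viewed
    ref: page.referrer,         // the address the user arrived from
    ts: String(page.timestamp), // when the page was loaded
  });
  return "https://collect.example.net/t.gif?" + params.toString();
}

// In a browser these values would come from document.location,
// document.referrer and Date.now(); they are supplied directly here.
const beacon = buildBeaconUrl({
  site: "worldnews",
  path: "/index.html",
  referrer: "http://www.sbs.com.au/",
  timestamp: 1094000000000,
});
console.log(beacon);
```

Because the request fires automatically on every page load, no action is required of the user – a
point taken up below in the discussion of observational data collection.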

These reports are not the end of the story – SBS Digital Media develops a working knowledge of
its online users through an open-ended exploration of the reports provided by RedSheriff. The
collection agency simply provides data summaries; it does not give recommendations, instead
encouraging Digital Media to adopt the mantle of data interpretation.

RedSheriff's system provides statistics such as number of page impressions, number of unique
visitors, user visit frequency and duration and addresses of pages from which users have arrived.
Information is presented in numerical and graphical form and is currently able to be viewed in
daily, weekly and monthly brackets. Additionally SBS has its results delivered in roughly thirty
sub-reports, categorised by site. It is thus possible to compare the usage statistics of The Movie
Show and The World News websites (or any other combination), as well as view an aggregated
report across all SBS web properties.
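As an illustration of how such per-site reports can be derived, the sketch below (my own, not
RedSheriff's implementation) rolls raw tag hits up into the two headline measures just mentioned:
page impressions (every page load) and unique visitors (distinct users), broken down by site.

```javascript
// Each hit record represents one page load reported by the instrumentation.
// The record shape and visitor identifiers are invented for illustration.
const hits = [
  { visitorId: "a", site: "The Movie Show" },
  { visitorId: "a", site: "The World News" },
  { visitorId: "b", site: "The World News" },
  { visitorId: "a", site: "The World News" },
];

function summarise(hits) {
  const perSite = {};
  for (const { visitorId, site } of hits) {
    if (!perSite[site]) perSite[site] = { impressions: 0, visitors: new Set() };
    perSite[site].impressions += 1;        // every hit is one page impression
    perSite[site].visitors.add(visitorId); // a Set keeps visitors distinct
  }
  // Flatten the visitor Sets into counts for the per-site report.
  const report = {};
  for (const site of Object.keys(perSite)) {
    report[site] = {
      impressions: perSite[site].impressions,
      uniqueVisitors: perSite[site].visitors.size,
    };
  }
  return report;
}

console.log(summarise(hits));
// The World News: 3 impressions from 2 unique visitors;
// The Movie Show: 1 impression from 1 unique visitor
```

The distinction between the two measures matters: a site whose impressions grow while unique
visitors stay flat is being used more heavily by the same people, not reaching new ones.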

Whatever proofs are put forward about the scientific reliability of statistics and survey panels, a
census of a population (the statistical term for the group represented by a sample) will
always represent that population more accurately than a sample can. Survey panels work best when measuring
television viewing because only five channels are commonly available in Australia (excluding
subscription viewing). The World Wide Web has far too many sites for a panel-based system to
work, especially for a niche broadcaster such as SBS.

Importantly, RedSheriff also collects feedback that is an integral part of the user's media
experience. Television ratings are triggered by the conscious action of a sample individual (a
button press) who knows that s/he is being monitored, and that the action is related to that
monitoring. RedSheriff, by contrast, collects information with no user input directed at shaping
the process: the data is assembled by automated processes rather than by conscious user action
(although some of what it represents is conscious activity). The fact that users cannot typically avoid
having their visit recorded goes a long way towards addressing concerns that the outcome of the
census procedure could be affected by knowledge that the census is being performed. That is,
RedSheriff collects information about all visitors, whether they consent to it or not.

The properties of near-total survey size (because it is possible that a negligible number of site
users will not have their visit measured, for miscellaneous technical reasons) and observational
data collection, discussed above, are enough to give RedSheriff data very good 'face validity' for
the items they measure. This means that the data is accepted as a true representation of what it
purports to measure. However it cannot always be seen to have 'predictive validity', as user
activity is always changing. Ratings report past behaviour but cannot always be relied upon to
predict the future. (Gunter in Bruhn-Jensen 2002:212)

In regard to the campaign result areas discussed above, RedSheriff ratings are very useful in
evaluating the first KRA, 'The Division will grow its online audience'. Carl Hammerschmidt,
Digital Media's Executive Online Producer, includes the monthly figures in his reports for
Campaign Eight. Usually a short explanation accompanies the report, outlining the reasons why
the numbers are what they are.

'The World News website traffic grew by 3%, which is a healthy sign after December's drop and reflects
the user base returning to work and [their] usual consumption patterns after the holiday season.

'The World Feast website lost 23% traffic, effectively falling back to
the pre-competition level. This is a clear indication of the impact
competitions and on-air promotions has [sic] on traffic.' (Campaign Eight report, January 2004)

The presence of RedSheriff data in the production process again illustrates producers' struggle to
retain some control over their audience.
producers have is measured primarily by how many visitors RedSheriff has recorded on SBS
websites, and how this figure compares to previous months. This information can provide
compelling reasons for action, especially when visitor statistics experience a rapid change. Such an
event happened in late September 2004 with The World Game television show's related website.

A recent redesign had given the page a new look - it was lighter in colour and generally appeared
less crowded. The new page layout was applauded by Digital Media and SBS Sport staff alike.
However within a week it became apparent, through RedSheriff figures, that traffic to the website
was suffering a marked decline. Traffic from August to September was down 700,000 page
impressions. Unique visitors to the site declined from 180,000 to 150,000. The average daily
figures for October (monthly figures will not be available until after this thesis is finished) show
that the slide is continuing.

These traffic levels were the worst that the site had experienced in over 18 months. An entire
year's growth had been wiped away in two months. Obviously something drastic had happened,
but what? Seasonal variations may account for some of the change – a similar drop was
experienced at the same time last year. The magnitude of the change was slightly larger in 2004,
but could well be within normal variances.

The explanation favoured within SBS is that the site redesign was the direct cause of the losses.
With perhaps greater diligence than has been applied to the site in some time, several speculative
'flaws' were identified and rectified as quickly as possible in early October. These included the
repositioning of some news items higher on the page, and a general effort to make the online
offering more independent of the related television program.

The changes reflected a visualisation of 'what the users wanted' when they visited the site. The
producers relied quite strongly on rationalising the needs of the users and how the site would
address those. For instance, the higher profile of some news items was related to the intuition that
site users were more likely to want soccer news than television scheduling information.

This crisis would not have been possible without the RedSheriff figures. It became all the more
urgent because the losses coincided with an action taken by SBS (the site redesign) – and thus
the two were conflated. Control over the audience was assumed to be direct, even with a similar
previous drop in figures taken into account. In this way the audience was directly present in the
decision making process; not only were ratings used, but also an instinctive supposition about
'what the audience wanted'. The audience undoubtedly affected producer behaviour.

User Surveys

There is only so much information that can be extracted from the RedSheriff ratings data.
RedSheriff collects only a few basic measures from each user, managing to build quite complex
representations of user behaviour from such information as time of visit, visit duration and
frequency, and number of pages accessed. However each site is measured in the same way, and
with the same shortcomings. Chiefly, RedSheriff is limited in its ability to describe audience
members (for instance, their demographics), and it lacks the ability to forecast some types of
audience behaviour. Digital Media also makes limited use of the figures beyond its
own department, fearing that audience figures may be used institutionally 'as a stick to beat us
over the head with', as one employee privately confided. To address these knowledge gaps, Digital
Media began surveying the users of some of its sites in 2003. It repeated the exercise in April
2004, a process in which I was deeply involved.

The initial planning document for the 2004 survey gave a retrospective overview of the 2003
survey, noting that the previous year's survey was designed 'to establish a precedent for gathering
quantitative and qualitative information regarding SBS's online audience'. (Meers 2004) Key goals
of the 2004 survey included:

'- establish a comprehensive model of New Media surveying activities for 2004;
- survey the audiences of a broad range of SBS websites; and
- gather site-specific data which are relevant to the needs of site administrators.' (Meers 2004)

Importantly, one of the survey's measures of success was that 'individual website developers will
have an increased understanding of their specific audience, resulting in user specific online content
upgrades, greater cross-promotional potential and a better understanding of how best to utilise
online sponsorship'. (Meers 2004) It was implicit in the planning of this exercise that the role of
the audience could extend to modulating how producers addressed that audience.

These goals demonstrate that SBS was not only planning to compile an archive of audience
feedback, but also to use that data for 'the needs of site administrators'. Not documented, but
certainly evident in SBS practices, is the use of the survey results in making evaluative statements
about how SBS Digital Media contributes towards the overall activities of SBS.

The surveys are not only useful for evaluating websites, but also for making production decisions.
At the time of writing, no major planning decisions had resulted from the 2004 surveys. However
the surveys of 2003 resulted in a number of new and modified site features, including:

- a 'By Popular Demand' graphic accompanying new site features;
- 'Desktop Delivery' (www.sbs.com.au/desktopdelivery), a service which shows all available SBS
email newsletters;
- the redesign of The World News website;
- in-language versions of the SBS Radio pages;
- an administrator tool to put cross-promotional buttons on other SBS sites (for instance, a The
World News button on the Dateline page);
- a television campaign to promote The World News website; and
- the launch of The World News email newsletter.
(Harcourt 2004)

The two surveys of the online audience that SBS has performed have focused on constructing two
main sets of results: characteristics of the audience that aren't (or can't be) measured by
RedSheriff, and audience responses to specific questions about SBS websites. In the former
category are demographic measures such as age, gender, location and language spoken at home. In
the latter category are questions about what parts of the site the survey respondent uses and how
satisfactory the respondent finds the site, as well as what features the respondent might like to see
on the site.

The actual surveys appeared as a pop-up window when the SBS sites involved (Dateline, The
World News, The World Game and The Movie Show, as well as The World Feast in 2003 only) were
loaded into a standard web browser. Users were first asked whether or not they wanted to
complete the survey:

'As a valued user of SBS online we'd like to ask you a few short questions about [this website]. The survey
will only take five minutes and your response will mean we can better tailor this website to what you
want. Thank you in advance for filling in this survey.' (Online survey 2004 – see appendix)

Upon completing the survey, or opting not to fill it in, users received a 'cookie' – a small piece of
information which prevented the survey from loading upon the user's next visit to the website.
Some questions were multiple choice, others just a check-box or a pull-down menu (for instance,
for specifying an hour of the day). One question on the surveys in both 2003 and 2004 was an
open-response text box in which users were asked, 'What do you like about [this website]? What
improvements would you like to see?' This text box was a concession to the essential
unpredictability of the audience, and an acknowledgement that SBS often possesses only bulk
statistical data, not qualitative or individual responses. It may be inferred from this that SBS
understands the value of occasionally receiving feedback on the audience's terms. The hope is that
the audience may reveal an agenda which was previously unknown or unimportant to producers.
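The once-only pop-up logic described above can be sketched as a small check against the browser's
cookie string. This is a minimal, hypothetical illustration: the cookie name 'sbs_survey_2004' is
invented, and the decision is written as a pure function so it can be shown outside a browser.

```javascript
// document.cookie exposes a "name=value; name2=value2" string; the survey
// window is suppressed if our marker cookie is already present.
function shouldShowSurvey(cookieString) {
  return !cookieString
    .split(";")
    .map((c) => c.trim())
    .some((c) => c.startsWith("sbs_survey_2004="));
}

// In the page itself this would run on load, roughly:
//   if (shouldShowSurvey(document.cookie)) {
//     openSurveyPopup(); // hypothetical function showing the pop-up window
//     document.cookie = "sbs_survey_2004=done; max-age=31536000";
//   }

console.log(shouldShowSurvey(""));                     // first visit: true
console.log(shouldShowSurvey("sbs_survey_2004=done")); // repeat visit: false
```

Note that the cookie is set whether the user completes the survey or declines it, matching the
behaviour described above: either response counts as 'seen', and the pop-up does not return.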

Central to the task of designing the survey was the necessity to make completing the survey
painless and quick. One process of streamlining was to ensure that only essential questions were
asked of online users. The final selection of questions thus demonstrates the institutional
knowledge that SBS wishes to possess.

The questions can be sorted into two groups depending on the aim of each. The first and largest
group consists of 'evaluative' questions, which build up a picture of an audience and arrange
it in segments. The second consists of 'planning' questions, which ask the online
audience to speculate about potential behaviour. Examining the questions asked in each group
gives some interesting clues into how SBS will end up using the information.

For example, why did the surveys ask respondents to identify their age group? The primary reason
is because the age of SBS's audience is the subject of much scrutiny, both internally and
externally. Age is widely regarded as one of the most important demographic categories because it
is a variable that supposedly relates to tastes – in SBS's case, media tastes. The website audiences
analysed by SBS showed clear age biases, as demonstrated below by the 2004 results for the
websites of The World Game and Dateline:

Question: Which age group are you in?

Age Group    The World Game    Dateline
Under 18     15%               4%
18 to 24     36%               16%
25 to 39     34%               33%
40 to 54     12%               28%
Over 55      3%                19%

(Meers 2004)

The survey revealed that Dateline's online users tend to be older than those of The World Game.
Constructing an audience according to its age profile allows Digital Media to develop 'gut
feelings' about the people using SBS websites. Age profiles also allow comparisons to be drawn
between online and television properties. Interestingly the web audiences surveyed in 2004 tended
to be younger than the television audiences for the same program. Combined with data which
shows that most website users are also viewers of the television program, SBS can begin to get an
idea about the correlation between television viewers and online users. For instance, many
Dateline users visit the website to find out what stories will be on upcoming episodes of the
television show.

Another reason for assessing the age of online users is so that SBS can determine whether it is
meeting its obligations to broadcast to all Australians. Just as in television, it is not enough to
argue that the availability of content to all Australians is the same as broadcasting to them. If SBS
cannot demonstrate that a wide variety of audience members are engaging with it, then any claim
that it serves all Australians becomes more tenuous.

The online survey assessed Digital Media's performance in this area not only in terms of audience
age, but also gender, language spoken and location within Australia. The latter measure divided
users into states and also into regional-capital city groups. A fifth question asked for the
occupations of users, but its results do not figure largely in the survey report.

These four categories are the characteristics that interest Digital Media in its definition of an
Australian audience. It is these categories which are used to construct an audience within the
context of broadcasting to all Australians. For instance, the 2004 survey report 'Who are our
online audience in 2004?' cites a growth in numbers of regional users over the past 12 months as a
success:

'The general increase in Internet uptake in regional areas [up from 18% in 2003 to 25% in 2004]
revealed by this survey dovetails with the promotional campaign that has been undertaken to advertise
SBS transmission upgrades in regional areas.' (Meers 2004)

The 2003 survey also hailed the percentage of regional users as a success:

Dateline – Are users based in a capital city or a region?

Capital City    82%
Regional        18%

Breakdown mirrors general population split so the web site is not overly biased.
Good indicator of national coverage and achieving Charter obligations to reach all Australians.
Opportunity to interrogate our regional viewers via the web sites.
(Harcourt and Wong, 2003:18)

This contrasts strongly with information provided by the Australian Bureau of Statistics:

'At June 2003, capital city Statistical Divisions (SDs) were home to 12.7 million people, around two-
thirds (64%) of Australia's population.' (Australian Bureau of Statistics 2004)

This is not an assertion that Digital Media produces deliberately misleading reports. Rather, it is
evidence that Digital Media feels a pressure to present itself as a success within the greater
organisation. Survey results, and to some extent ratings reports, are interpreted so that average
results become good, and poor results become 'opportunities'.

User surveys gather an interesting type of audience feedback. It consists mainly of empirical data
such as age, gender and so forth; these measures are useful for assessing how well a site is
performing, or what sort of audience it attracts. However, some measures, such as open-response
boxes or queries about what site features users would like to see, show that Digital Media accepts
the need to be influenced by the audience – or at least to listen to what it wants.

User Testing

RedSheriff ratings and user survey data are invested with a certain type of claim to validity. By
virtue of the large size of their component samples, ratings and surveys claim statistical validity. It
must be noted, however, that the validity required for action often has less to do with scientific
appraisal, and more to do with what intuitively makes sense. As long as the numbers look basically
correct and don't contain too many unwelcome surprises, they are usually acceptable.

This means that large audience samples may not always be necessary to get data that can be used
to take action. In the following case, Digital Media used a sample of just six audience members,
all of whom were recruited to take part in a website usability study. The feedback obtained in the
study was useful for reasons that had nothing to do with sample size, and everything to do with
the content of the feedback.

The production involved was SBS's World Tales project, in which twenty budding animators were
commissioned to interpret folk tales for the screen. The stories were gathered from many cultures
around the world. Being a multimedia production with a large online component, World Tales fell
under the auspices of Digital Media. The website itself (www.sbs.com.au/worldtales) is a
technically complex and deep production, housing all twenty animations and other related
material.

Before World Tales was launched in August 2004, the producers involved with the project decided
to undertake a user-testing study. The primary reason for this was to identify any mistakes in the
site, or hurdles that might interfere with the audience's use of the site. Although identifying such
problems was within the abilities of the production team, it was thought that 'fresh eyes' might
pick up problems more successfully.

Because the budget for the study extended only to providing lunch and a Cabcharge voucher for
participants who had to travel to SBS, the test subjects were drawn from a limited field. Three
teenagers (two of them children of staff) and three SBS employees from
outside Digital Media were chosen to take part. The participants were aged between sixteen and
mid-forties. Despite having a limited field from which to draw participants (co-workers and
children of staff), efforts were made to find people from multiple age groups, none of whom had
prior exposure to the website.

My colleague Cassandra Meers and I took charge of the user-testing process itself. One computer
was provided for each of the six subjects. A seventh PC hooked up to both a monitor and a data
projector served two purposes. As a demonstration machine, we used it to show participants parts
of the site. As a testing machine, it allowed us to give participants a set of six tasks and then
observe how they went about accomplishing them.

After providing the testers with a brief explanation of the site and showing an example animation,
we turned them loose and monitored their use of the site for about forty-five minutes. During that
time we asked each to come up to the demonstration machine and run through the list of
prescribed tasks.

At the end of the session several of the Sydney-based producers joined us in the testing room, with
two Melbourne-based producers joining in via a conference phone. Together we discussed the
website and some of the problems that participants had experienced during the testing process.
Some inconsistencies in the programming of page-scrolling buttons were discovered. Testers
reported that they found it difficult to know when they had reached the bottom of some pages, as
no visual indication (such as a scroll bar) was available.

Users also expressed some frustration about certain navigational elements, as well as the lack of
any indication about how long each animation lasted (such as a progress bar). Not everything was
a complaint – in fact, all six testers expressed a generally positive opinion about the site. This is
the kind of feedback which cannot come from RedSheriff. Even in user surveys, Digital Media
must first decide to ask such a question before learning how users feel. Frustration is interesting
because it measures inaction, whereas most other feedback methods trace user actions.

Unfortunately only some of the problems could be rectified, mostly for practical reasons. Parts of
the site were subcontracted and could not be modified without further expense. Other elements
(such as more sophisticated scroll bars) had been tested but rejected due to technical
incompatibility between different internet browsers. Some elements would simply have required
too much effort to alter between testing and launch. One major paradox is that user-testing often
requires a reasonably complete product; subsequent changes can be too hard to deliver within
deadlines.

Can this user testing be thought of as an audience feedback process? The testers – an artificial
audience – were able to give the necessary feedback, and there were no obvious reasons why they
might not have formed part of the 'natural audience'. There is therefore little reason to deny that
this testing was a process of audience feedback.

It is also a process which, within the practical limitations described above, had a definite influence
on production practices. Just as ratings and surveys empower decision-making processes, user-
testing is useful for smoothing out the difficult parts of websites and improving their utility. Each
method helps Digital Media to judge the effects of its actions upon the audience. Feedback also
modulates the production processes of Digital Media staff in their pursuit of the best ways to
serve audiences. Digital Media is thoroughly involved with its audiences, both on the controlling
and the controlled end of audience relations.

Chapter Four: Public Relations

I have already examined the ways in which audience feedback is used to inform processes in SBS's
television and online areas. These processes include evaluation and justification of content,
commercial and government advocacy, and future content production decisions. Through the
previous chapters I have shown how SBS listens to and uses its audiences in production-related
contexts. In this chapter I will explore how SBS involves its audiences in tasks that are not
directly related to production. These may be thought of as processes of public relations, in the
general sense of relating ideas or purposes to the public.

The public I have defined is not a singular entity. It is a heterogeneous collection of people and
organisations with different types and levels of interest in SBS. It is another group of SBS
audiences, although not one which is addressed with standard broadcast content. Public relations
is the area in which staff at SBS are most directly involved in a dialogue with these audiences,
which are not always external. Internal public relations is a way of accomplishing a number of
inwardly-focused institutional goals.

Since these audiences have different associations with SBS, they must be listened to and addressed
in different ways. Although the methods may be various, the goal is always to 'wave the flag', a
belief expressed to me by Corporate Communications manager Keith Dalton. The crisis of
knowledge that SBS suffers in relation to its audience also works in the
opposite direction: audiences cannot know about SBS in any detail unless somebody feeds
information back to them. Information goes out to audiences from sources that may have nothing
to do with SBS.

SBS management perceives a need for audiences to 'get the right message' about SBS – that is, the
message which reflects institutional perspectives. This is because public and commercial
accountability are at the heart of SBS's ongoing operations. If SBS's public (advertisers, viewers,
bureaucrats and so on) does not perceive SBS as accountable for its actions (through engaging
with the public audience) then SBS may face difficult questions about its institutional practices.
Public relations processes seek to bring public perceptions of SBS into line with the broadcaster's
desired public image.

I will demonstrate several ways that SBS talks to its audiences, and how those processes seek to
accomplish institutional goals. I will show that the information disseminated by SBS all fits into a
grand narrative about the broadcaster, regardless of the audience to which it is directed. In other
words, messages from within SBS to various audiences may vary in approach, but always seek to
contribute to an overarching narrative of identity.

I will first examine the most comprehensive document to cover SBS's performance: the
corporation's Annual Report. As I write this (September 2004) the 2003-2004 Annual Report is
beginning to appear on desks around SBS. In its own words,

'This Annual Report details the programming, content and service provided by SBS Television, Radio
and Online. It also reports on SBS's relations with Government, the community and other stakeholders,
and the way in which SBS manages its human, financial and technical resources as well as its
transmission services.' (SBS 2004: iii)

Section 73 of the Special Broadcasting Service Act of 1991 sets out the requirements for
information in annual reports. 'In addition, [the 2004 report] assesses the Corporation's
performance against the goals of the SBS Corporate Plan 2004-06' (SBS 2004: iii). I will cover
only those aspects of performance which are supported by the use of the audience.

Annual reports tend to present information in the best possible light. Figures abound, as seen in
the 2004 report:

'- 800,000 Australians watched the SBS live broadcast of the Danish Royal wedding.' (page 1)
'- Radio staff produced 14,820 hours of individual language programs.' (page 23)
'- Traffic to the SBS website has increased 35% annually for the past four years.' (page 33)

The examples above leave it to the reader to interpret how 'good' the results are. They are
provided without context. This is not always the case:

'A record 5.7 million page impressions were recorded in June 2004.
SBS Online, which streams in 68 languages, is the world's most linguistically diverse website.
The World Game … is one of Australia's most popular sports websites.'
(page 33)

Either way, the statements above are supposed to encourage a positive feeling about the
Corporation's activities. In the areas of television, radio and online, annual reports are used by
SBS to tell success stories. Often this is accomplished by interpreting audience feedback such as
television ratings, but also by presenting other institutional decisions to the audience. The points
within the reports contribute to an institutional ideology: SBS the well-performing, valuable
broadcaster.

Similar to, but more targeted than, the annual reports is the 'SBS Review' newsletter. This
periodical is distributed to some four hundred individuals who may be in some way involved with
SBS. The list includes every sitting Senator and Member of Parliament, media industry
personalities, Community Advisory Committee members and the members of the Federation of
Ethnic Communities' Council of Australia (FECCA) (Dalton, 2004).

The Review is sent out via email with story summaries and links to its website
(http://sbs.com.au/admin/sbsreview/newsletter.html). Dalton explained that the newsletter is
designed to raise SBS's profile within the community. The newsletter is also constructed to speak
to a wide range of audiences including possible sponsors, government figures and community
representatives. Using examples from the September 2004 newsletter, a sponsor might read about
the audience figures for the program John Safran Vs God. A FECCA representative might be more
interested in the section 'SBS Radio Reaches New Audiences', about the four new language
groups in SBS's radio stable (Malay, Somali, Amharic and Nepalese).

Naturally many of these pieces rely on interpreted audience feedback to make their point:

'In its first three weeks, [John Safran Vs God] (8.30pm Mondays) is averaging about 400,000 viewers
in the five mainland capital cities, with an additional 150,000 estimated audience in regional Australia.
It's attracting strong viewing from younger people with a 12% share for people 16-39 and a 17% share
for men 16-39.

SBS's reach also increased with 8.1 million viewers tuning in during the two weeks [of the Athens 2004
Olympic Games] – an increase of 24%. Importantly, this increase was most noticeable among younger
viewers with 90% of additional viewers under the age of 50 and 64% under the age of 35.' (SBS
Review, 2004-09-04)

Institutional uses of audiences are typically invisible to those audiences. Though they obviously
involve an audience, processes like the sale of advertising and the evaluation of programming
occur out of view of the general public. To a large extent these processes also involve a limited
institutional 'public'. That is, many audience-related processes within SBS take place relatively
unobserved. This is most likely because of the 'need to know' status of politically, commercially
or creatively sensitive processes.

A political example is the way that Digital Media obscures many of the traffic statistics for its
websites, as detailed in Part Three. These figures, while having utility for Digital Media
producers, are also a liability in that they may reflect unfavourably on how websites have been
designed or operated.

One of the most open and public ways in which SBS's management engages with its audiences (as
citizens) is in its handling of complaints against the broadcaster's programming content.
Programming on SBS is often controversial, as many sections of the Australian community have
polarised opinions on issues covered by SBS (not least of which is the Israeli-Palestinian conflict).
Most of the complaints received by SBS concern the content broadcast on television – much of
the time documentaries or news and current affairs programs, but also the occasional comedy
program, such as the often-confronting South Park or Pizza.

All Australian television stations have a responsibility to follow Codes of Practice in relation to
their program content, as established in the Broadcasting Services Act 1992. SBS develops its own
Codes of Practice, as laid out in Section 10(1)(j) of the SBS Act 1991, and notifies the Australian
Broadcasting Authority of these codes (ABA 2004). Listening and responding to
complaints is thus a legislated requirement for SBS. However, the proper handling of audience
complaints is also fundamental to maintaining a healthy relationship with audiences.

The SBS Codes of Practice detail the broadcaster's processes for dealing with complaints. In 2004
the SBS board undertook a comprehensive review of the broadcaster's complaints handling process.
According to the document 'Background to Complaints Handling Review', published by SBS’s
Policy Department:

'The complaints handling review began about a year ago in response to increasing public interest in the
handling of complaints and perceptions that SBS's system could be improved.'

This lengthy review has recently (as of October 2004) resulted in several changes to the
complaints handling procedures in SBS's Codes of Practice. All of these demonstrate an
institutional commitment to the better management of audience complaints. The first major
change is the creation of the Audience Affairs Manager (AAM) position. The AAM is now
'responsible for the coordination, investigation and determination of all formal complaints' (SBS
Policy, 2004).

Secondly, a new complaints-handling database – accessible via SBS's intranet – is the central
repository for all complaints made against the broadcaster. It is a simple tool for tracking
complaints and making sure that SBS's obligations under its revised Codes of Practice are met.
The database ensures that relevant staff in each division are made aware of new complaints and
that complaints are dealt with within the period specified by SBS.

It was discussed in the House of Representatives' Standing Committee on Communications,
Information Technology and the Arts whether SBS's complaints processes might be moved
to an independent department:

'CHAIR—One of the views put forward is that perhaps the complaints review procedure or the panel
should be somewhat separate from the SBS board. Have you given any consideration to the members of
the complaints panel being independently appointed by the minister so that they are totally independent to
that of SBS?

Mr Milan [managing director of SBS]—We have not got that far. Most of our focus at the moment is
actually on improving our internal systems—to improve the transparency and, if you like, the
independence of the internal process. One of the flaws of the current system is that the program maker of
the program that is being complained about or the person that purchased the program and made the
decision to publish gets involved far too early in the process. So one of the things we are looking at
primarily is to improve the transparency and the independence of the internal process.' (Hansard, 2004-
02-11: 21-22)

There is obviously a difference between having 'members… totally independent' and improving
'independence of the internal process'. I would say that it is a matter of control. If a complainant
is not satisfied with SBS's handling of a complaint, s/he may refer the matter to an ABA review.
Therefore the complaints handling process is SBS's final opportunity to control its public
appearance.

An internal complaints process allows SBS (in the event of a breach of its Codes) to adopt the
position that it has corrected itself, while an external process may give the appearance that SBS is
taking direction. The difference is that making voluntary restitution, through apologies or
otherwise, leaves more of SBS's reputation intact than an enforced backdown delivered through
the ABA. Keeping the process internal allows SBS greater control over the manipulation of its
public image. Of course, SBS is occasionally confronted with a kind of audience feedback that has
great potential for damaging the broadcaster's public image.

On the 6th of October 2003, SBS began broadcasting the program Thoi Su (Vietnamese for
'Current Affairs'), 'a 35-minute bulletin produced by the Vietnamese Government-controlled
VTV4, from Monday to Saturday as part of its WorldWatch news and current affairs package'
(Sydney Morning Herald 2003-12-02). The reaction to this broadcast was unprecedented. Large
sections of the Australian Vietnamese community – a largely refugee community – were in uproar
about what they saw as hurtful propaganda being re-broadcast by SBS. Protests ensued.

Christopher Kremmer detailed the protesters' methods in his article 'Breaking the news at SBS'
(Sydney Morning Herald, 2003-12-20). These included bombarding SBS's telephone, mail and
email feedback channels with thousands of protests. Two rallies outside SBS's Sydney and
Melbourne headquarters in late October attracted an estimated crowd of twenty thousand.

This is an example of a rare time when SBS came into direct contact with its audience. Usually
the 'home viewers' are dealt with through abstractions such as ratings. In this case the audience
brought its message directly to SBS. And yet these protesters, an audience incarnate, were still
made known to SBS through their feedback. This is further evidence that everything emerging
from an audience is a form of feedback.

It was simply impossible for SBS to ignore the weight of negative audience feedback in this case.
Despite Mr Milan's assurances to a Senate estimates committee that 'from what we have seen,
there is nothing [in the program] that would breach our codes of practice' (Hansard, 2003-11-03:
6), and despite a small amount of support for the program from within the Vietnamese
community, Thoi Su was quickly removed from SBS's programming.

SBS apologised on account of the hurt caused by the broadcast, and for breaking a promise (made
by a previous head of SBS-TV) to consult with the Vietnamese community over the introduction
of such a news service (Hansard 2004-02-11: 3, 6, 18, 20). This last point may indicate a
bureaucratic error, but I conjecture that SBS probably anticipated the controversy that the
program would generate, and thus tried to avoid drawing too much attention to the broadcast.
This is simply an example of considering the audience when making a decision.

The outrage of the audience stimulated a debate, involving SBS and the wider community, about
the relative merits and drawbacks of the service. On the negative side, the presence of the
program on Australian television stimulated strong feelings of hurt within the Vietnamese
community. On the positive side, some members of the community were pleased to have any
news from their home country. Another issue was that many of SBS's WorldWatch programs
come from state-controlled (some might say propagandist) broadcasters, and are usually shown
without complaint. But SBS's concerns about its editorial independence were thoroughly
overruled by the strength of emotion present in the feedback against Thoi Su. It would be fair to
say that the broadcaster's hand was forced in the interests of repairing community relations.

The audience rarely has such a visible effect as this. But the intensity of feedback often correlates
with the strength of SBS's reaction to it. SBS is compelled by both legislative and practical
reasons to engage in these very direct forms of public relations. SBS always needs to justify its
programming decisions (and by extension, its public funding). The style of public relations
conducted by SBS is always geared towards maintaining and improving the broadcaster's public
image. In the words of Nigel Milan, 'SBS has grown because of popular demand and continues to
evolve to meet audience needs.' (Milan 2003) That is the heart of the message that SBS attempts
to deliver to its audience, the message that it wants to make its audience believe. Public relations
is all about listening to the audience's feedback, and using it to attempt to generate a desired
impression of SBS within that audience.

Chapter Five: Conclusions

It is obvious from my research that SBS is always engaging with its audiences. SBS speaks to its
audiences and listens to the feedback coming from them. This relationship is central to SBS's
ongoing institutional practices. It is a relationship which profoundly affects how SBS conducts its
core business of creating and broadcasting content to the general public. While dealing with
audiences is a complex and multi-layered task, one which consumes a great deal of energy and
resources within SBS, the knowledge gained empowers staff to act decisively when carrying out
tasks.

I have focused on examining the ways in which SBS engages with its audiences, and the effects of
these engagements on how SBS makes decisions. Before looking at the uses of audiences in more
detail, it is worth summarising what an audience is, and why audiences matter to SBS.

Audiences are only relevant if they can be perceived by SBS. Therefore audiences must be visible.
Audiences must also be intelligible. They cannot be dealt with if SBS cannot make sense of them.
Does this mean that the average SBS-TV viewer, sitting on his couch at home, is invisible and
irrelevant? To some extent, yes – unless that viewer happens to be part of the OzTAM ratings
panel. If he is not, it is impossible (or at best highly unlikely) that SBS will receive feedback
about his audience-ness. However, the statistical power of a ratings panel works to make viewers visible
and intelligible as an aggregated quantity. Ratings are an audience surrogate; they are not 'whole
audience' feedback, but they make the television audience visible and intelligible.

All audiences, as SBS knows them, are therefore abstractions. An audience is described as a series
of specified variables. An audience is a trace of feedback patterns. Online surveys trace audience
responses to selected questions. Online and television ratings trace audience consumption
activity. Audiences are simply feedback which has been structured and classified. 'Such figures
produce a sense of concreteness, a sense of ontological clarity about who or what the ... audience
is.' (Ang 1991:34) In some cases, ideas of audience are built up without constant input from
empirical data. This is what I have referred to as 'gut feeling', a tool that is sometimes more
empowering than reams of pure statistics.

But why does SBS need audiences at all? Well, in some senses SBS could get by without thinking
about or referring to an audience. It is of course possible to create a program, or build a website,
without thinking about an audience. In reality, this kind of blindness immediately vanishes as
soon as a simple concept such as 'target audience' is considered. 'The audience' is a quantity that is
always present somewhere in production processes.

Other vital tasks intrinsically require the creation and use of an audience. The sale of air time to
advertisers is one such task. Evaluation and justification of programming decisions never takes
place without reference to an audience, though it potentially could. And audiences are central in
maintaining a professional corporate image; the handling of audience complaints is an example of
this. In short, audiences are necessary for SBS to avoid existing in a virtual vacuum. Audiences
allow SBS to make judgements about the success of its core business, and to plan further action.

These are, therefore, the three main roles the audience plays at SBS:

1. Enabling evaluation

Examining audience feedback is the primary way that SBS evaluates its level of success. SBS needs
to evaluate its success because success is its main goal. The relative size of ratings and the
weight of positive against negative feedback are the two main measurements of success.
Often the level of discussion generated by content is taken into account. SBS acknowledges that
its programming may be contentious. Provocative or controversial material might generate much
negative feedback, but still be viewed within SBS as evidence for a program's success. The limits
of this phenomenon were demonstrated by the 'Thoi Su' controversy discussed in Chapter Four.

2. Stimulating creativity

SBS's charter allows a great deal of creative freedom in what the broadcaster produces. Decisions
about what to produce are greatly informed by intersections of the charter and knowledge about
the audience. SBS knows that certain television programming is likely to attract certain audiences.
Ideas about audiences allow SBS to make educated guesses about the likely outcomes of
particular strategies. Producers creating online content invest a lot of thought in considering
usability, which is actually a consideration of how audiences will use a website. Audience feedback
also allows changes to be made to ongoing productions, usually with the goal of improving what is
being offered to the audience. The changes to the World Game site, discussed in Chapter Three,
are an example of how an audience shaped production.

3. Allowing SBS to conduct external relations

'External relations' means two things. Firstly, audiences allow SBS to conduct some kinds of
commercial activity, such as the sale of advertising space. This is mainly done with the assistance
of ratings, but there exist a few other measures, such as Newspoll surveys, which can be used for
this purpose. SBS is effectively selling its audience to advertisers.

Secondly, audience feedback allows SBS to undertake tasks which shape the broadcaster's public
image. Some of these are the processes covered in Chapter Four, including annual reports and
complaints handling. In these cases SBS uses its audience as a tool to modify its preferred public
image. Audiences that are used for public relations frequently overlap with audiences used for
evaluative purposes, with the key difference that public relations overwhelmingly highlights SBS's
success stories.

These three roles of the audience are all involved with the economy of control surrounding SBS's
audience relations. Producers, through the agency of their programs, attempt to have some effect
upon the audience. This effect is nominally social, according to the charter: 'to inform, to
educate, to entertain'. Audience feedback is the method overwhelmingly used by producers to
figure out how well this effect is being achieved – to determine the effectiveness of their presumed
control over audiences. A less common consideration, but a very important one, is that audience
feedback also exerts control over producers. This may be unintentional or deliberate on the part of
the audience. For instance, ratings audiences are probably not attempting to change how
producers act. On the other hand, complainants and survey respondents may be trying to alter
SBS's production processes. In either case the outcome may be that producer behaviour is
changed.

The type of control evident in these processes is 'manipulation by proxy' – neither audiences nor
producers have direct control over one another. Producers attempt to act on the audience
through the agency of their productions. Audiences, consciously or not, attempt change through
feedback. Neither is this control total. Producers are free to ignore audience feedback, and
audiences are free to adopt whatever viewing behaviour they find appropriate.

And the project of control is never complete, but takes place daily within SBS. It is a constant
battle to deal with audiences; they can be fickle, slippery and difficult to understand. But in the
end that is exactly SBS's task: to attempt to fulfil its institutional goals, as laid out in the Charter,
by engaging with its audiences in ways that are practical and useful.

References

- Australian Broadcasting Authority (2004): 'ABC and SBS Codes of Practice'
(http://www.aba.gov.au/tv/content/codes/national/index.htm accessed 7 October 2004)
- The Age (9 November 2003): 'SBS in Strife'
(http://www.theage.com.au/articles/2003/11/08/1068243304036.html accessed 10 October 2004)
- Ang, Ien (1991): 'Desperately Seeking The Audience'. Routledge, London.
- ATR Australia (2004): ‘Who is ATR?’ (http://www.atraustralia.com/docs/whoisATR.html accessed
8 September 2004)
- Australian Bureau of Statistics (2003): 'Australian Social Trends 2003: Population – Population
Distribution: Population Characteristics and Remoteness'
- Australian Bureau of Statistics (2004): '3218.0 Regional Population Growth, Australia and New
Zealand'
- Broadcasting Services Act (1992): (http://scaleplus.law.gov.au/html/pasteact/0/136/top.htm
accessed 7 October 2004)
- Dalton, Keith (2004): Interview with author
- Deleuze, Gilles (1990): 'Postscript on the Societies of Control'
(http://www.n5m.org/n5m2/media/texts/deleuze.htm accessed 31 October 2004)
- Gunter, Barrie: 'The quantitative research process' in Bruhn-Jensen, Klaus (2002): 'A Handbook
of Media and Communication Research'. Routledge, London.
- Hansard (11 February 2004): House of Representatives' Standing Committee on
Communications, Information Technology and the Arts.
- Hansard (3 November 2003): Senate Environment, Communication, Information Technology
and the Arts Legislation Committee (Estimates).
- Hansard (16 February 2004): Senate Environment, Communications, Information Technology
and the Arts Legislation Committee (Estimates)
(http://www.aph.gov.au/hansard/senate/commttee/S7308.pdf accessed 10 October 2004)
- Harcourt, Emma (2004): Correspondence with author
- Harcourt, Emma and Wong, Derrick (2003): 'Who is our online audience?', internal SBS report
- Maley, Mark (March 2004): Interview with author
- Meers, Cassandra (2004): 'Who is our online audience in 2004?', internal SBS report
- Milan, Nigel (26 November 2003): 'SBS, and what it does - Speech to the Consular Corps,
Sydney' (http://www.sbs.com.au/sbscorporate/index.html?id=856 accessed 25 October 2004)
- Millward, Emelia (2004): Email correspondence and interview with author
- Nightingale, Virginia (1985): 'What's Happening to Audience Research?', Nepean College of
Advanced Education, Westmead
- Quirke-Parry, Pat (2004): Email correspondence with author
- RedSheriff (2004): 'Frequently Asked Questions'
(http://www.redsheriff.com/au/content/faqs_1.html accessed 25 August 2004)
- SBS Annual Report (2003)
- SBS Annual Report (2004)
- SBS Policy Department (2004): ‘Background to Complaints Handling Review’
- SBSin 2004(1) (http://www.sbs.com.au/SBSin/content.php3?id=19 accessed 13 October 2004)
- SBSin 2004(2): 'SBS Television 2004' (http://sbs.com.au/media/8592Jan_2004_Update.ppt
accessed 13 October 2004)
- Special Broadcasting Service Act 1991 (http://scaleplus.law.gov.au/html/comact/7/3867/top.htm
accessed 13 September 2004)
- Sydney Morning Herald (2 December 2003): 'Crunch time for SBS over Vietnamese news
bulletin' (http://www.smh.com.au/articles/2003/12/01/1070127351359.html accessed 10 October
2004)

Index to Appendix

SBS Charter: 44

Technology and Distribution Campaign Eight: 45

Dateline User Survey: 47

SBS Charter

SBS was established as an independent statutory authority on 1 January 1978 under the
Broadcasting Act 1942. The Special Broadcasting Service Act 1991 (C'wlth) which came into
effect on 23 December 1991, established SBS as a Corporation. This Act gives SBS a clear charter
setting out what the Australian people through the Parliament require of SBS as a national
broadcaster. The functions which Parliament has prescribed for SBS are set out in the Charter of
the Corporation (section 6 of the Special Broadcasting Service Act 1991) and are:

The principal function of SBS is to provide multilingual and multicultural radio and television
services that inform, educate and entertain all Australians, and, in doing so, reflect Australia's
multicultural society. SBS, in performing its principal function, must:

(a) contribute to meeting the communications needs of Australia's multicultural society, including
ethnic, Aboriginal and Torres Strait Islander communities; and

(b) increase awareness of the contribution of a diversity of cultures to the continuing development
of Australian society; and

(c) promote understanding and acceptance of the cultural, linguistic and ethnic diversity of the
Australian people; and

(d) contribute to the retention and continuing development of language and other cultural skills;
and

(e) as far as practicable, inform, educate and entertain Australians in their preferred languages; and

(f) make use of Australia's diverse creative resources; and

(g) contribute to the overall diversity of Australian television and radio services, particularly taking
into account the contribution of the Australian Broadcasting Corporation and the public
broadcasting sector; and

(h) contribute to extending the range of Australian television and radio services, and reflect the
changing nature of Australian society, by presenting many points of view and using innovative
forms of expression.

TECHNOLOGY & DISTRIBUTION DIVISION
CAMPAIGN MEASURES

Campaign Eight
Maintaining and growing our strengths in digital media creation

KRA 1
The Division will grow its online audience.

Performance Measure
1. Increasing the number of people visiting our websites, with specific focus on the Radio website, The
World Game site, The World Tales site, and My Space site, sbs.com.au.

2. Planning and developing quality content with TV and Radio.

3. We will benchmark the number of areas with multimedia extensions to their traditional content.

4. Increasing the promotion of SBS New Media through internal and external outlets.

Performance Indicator
1. By December 04 we will achieve a 25% increase in traffic to the aggregate SBS domain.

2. We will achieve a minimum monthly increase in traffic of 10% throughout 2005.

3. Three external press mentions per month.

4. By August a systematic program of consultation with TV and Radio programming areas will be in place.

5. By June 05 we will grow benchmark of multimedia content extensions to traditional media by 15%.

6. By September we will provide headline news and sport services made available to third party websites.

KRA 2
The Division will develop a sustained online community of loyal visitors.

Performance Measure
1. Increasing the number of people collaborating and interacting on our sites.

2. Increasing the number of user participation channels (eg forums, polls, votes) on our sites.

3. Increasing the number of registered users to our email database and extending the number of online
email services.

4. Responding to user feedback and measuring user satisfaction through online surveys.

Performance Indicator
1. One new user participation channel per month.

2. A minimum monthly increase in number of responses to participation elements of 10%.

3. Regular increases in the size of email registration databases.

4. One new email product delivered bimonthly.

5. By August we will complete 2004 Online Audience survey and commence review of feedback from
online users.

KRA 3
The Division will extend its content offer through interactivity.

Performance Measure
1. Increasing the interactive channels available to us, including:
a. SMS
b. Video
c. Publishing, Performance
d. Retail

2. Increasing the types of interactivity available to the audience.

Performance Indicator
1. By December 04 we will deploy 7 new SMS services.

2. By December 04 we will deploy 4 interactive applications for TV and Radio programming.

3. By December 04, we will enhance two other programs through SBS Essential, in addition to the UEFA
soccer tournament.

Dateline Survey - http://www.sbs.com.au/survey/dateline.html (3 November 2004)

1. How often do you visit this site?

At least weekly
At least monthly
Less often
This is my first visit

2. What led you to visit the Dateline website today?

I am a regular visitor
I followed a link from another SBS website - please specify which site
I followed a link from another website - please specify which site
I found this page through a search engine - please specify below
I saw this website advertised on TV
A friend recommended it to me

3. Please rate the following statements about the Dateline website:

a) I find the website engaging

strongly agree
mildly agree
neutral
mildly disagree
strongly disagree
none of the above/don’t know

b) I think the website is informative

strongly agree
mildly agree
neutral
mildly disagree
strongly disagree
none of the above/don’t know

c) The website content is up-to-date

strongly agree
mildly agree
neutral
mildly disagree
strongly disagree
none of the above/don’t know

d) The website doesn't provide me with the information I am looking for

strongly agree
agree
neutral
disagree
strongly disagree
none of the above/don’t know

e) The website is slow to access

strongly agree
agree
neutral
disagree
strongly disagree
none of the above/don’t know

f) It's hard to find things on the website

strongly agree
agree
neutral
disagree
strongly disagree
none of the above/don’t know

4. Where do you access the Dateline website?

Home
Office
University
Other (library, internet cafe etc.)

5. When you visit this site, do you view: (tick for yes)

This week's upcoming stories
Text transcripts from the Dateline archives
Video footage from the Dateline archives
'Have your say'

6. What is your main purpose for accessing Dateline's archives?

Research purposes
Catching up on a missed episode
General interest in current affairs
I don't use Dateline's archives

7. How long do you usually spend on the Dateline website?

Under one minute
Between 1 and 2 minutes
2 - 3 minutes
3 - 5 minutes
Over 5 minutes

8. Do you watch Dateline on SBS Television?

Yes
No

9. What other current affairs television programs/websites do you view/access?

(open response box)

10. How likely is it that you would recommend this site to a friend or colleague?

Very likely
Maybe
Uncertain
Probably not
Definitely not

11. What do you like about the Dateline website? What improvements would you like to see?

(open response box)

12. What is the current local time, to the nearest hour?

(time selection boxes)

13. What is your gender?

Male
Female

14. Which state are you in?

(Drop down menu of Australian states and territories. Option 9 is ‘Outside Australia’.)

15. Are you living in:

A capital city
A regional area

16. What age group are you in?

Under 18
18 – 24
25 – 39
40 – 54
55 and over

17. Which of the following best describes your occupation?

Professional
Clerical
Skilled manual/Trades
Self-employed
Student
Other

18. Do you speak a language other than English at home?

No - English only
Yes - Italian
Yes - Greek
Yes - Cantonese
Yes - Mandarin
Yes - Arabic
Yes - Vietnamese
Yes - Other

19. Would you like to submit your email address to our database of online users? You can choose
how you would like your address to be used.

(Box for email address)

Select if you would like to participate in future SBS research
Select to be contacted about new SBS products and services

(Button: ‘Submit’)
