
David M. Dozier

Program Evaluation and the Roles of Practitioners
Pressure from several corners is forcing public relations practitioners to think about ways
to measure and evaluate public relations impact. Professional associations and publications
are providing guidance to practitioners seeking to measure their activities, while educators
are working to introduce evaluative research instruction to communications curricula.
Perhaps the strongest call for redefining public relations, according to this author, was
made by Edward J. Robinson, who said that the practitioner of the future will be "an
applied social scientist." He sees public relations as moving away from "seat-of-the-pants"
approaches and toward what he terms "scientifically derived knowledge."
The author of this article attempted, through a survey conducted in 1981, to determine to
what extent today's public relations professionals operate as "seat-of-the-pants"
communicators or as professionals increasingly dependent on scientific techniques--or some
combination of both.
Dr. Dozier is an assistant professor in the Department of Journalism, San Diego State
University. He gratefully acknowledges the help of Dr. Glen Broom, head of SDSU's public
relations emphasis. Funds for Dr. Dozier's survey were provided through a grant from the
San Diego State University Foundation.

What's happening among practitioners in response to calls for measuring
and evaluating their public relations efforts? A 1981 survey of
professionals indicates that three styles of program evaluation are emerg-
ing. Two styles, one "scientific" and the other "seat-of-the-pants," are used
by manager-type practitioners. Technical-type practitioners, on the other
hand, adopt no particular style of program evaluation.
A sampling frame for public relations practitioners in San Diego, Calif.,
was developed through the cooperation of four professional organizations:
Public Relations Society of America, International Association of Business
Communicators, Public Relations Club of San Diego and Women in Com-
munications. Questionnaires were mailed to 333 subjects, obtained from
these organizations' mailing lists in February 1981. Non-respondents were
mailed a second questionnaire in March. A total of 169 questionnaires was
completed and returned for a 50.7 percent response rate.

Public Relations Review
The study of professionally-affiliated practitioners in a single community
cannot be said to represent the national population of practitioners in a
statistical sense. However, subjects who were members of the Public Rela-
tions Society of America were compared with a national study of PRSA
members. Cross checks of key variables indicated that San Diego practi-
tioners were very similar to their national counterparts. 1
Factor analysis was used to identify underlying "styles" of evaluation
among the 12 activities measuring the focus and methods of evaluation. 2 If
two different evaluation activities vary together (practitioners who fre-
quently do one activity also frequently do the other activity), then they are
both indicators of an underlying "style" or approach to evaluation. Called
factors, these clusters of activities are then examined to identify character-
istics they have in common. These shared characteristics point to an under-
lying type. 3
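The clustering step described above can be sketched in code. The sketch below is illustrative only: the original survey responses are not available, so the 169 x 12 matrix of activity ratings is simulated, and the three-factor varimax solution (matching the method described in note 4) is applied to those made-up data.

```python
# Illustrative sketch of the factor-analysis step: the survey data are
# simulated, so all numbers below are assumptions, not the study's data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulate 169 respondents rating 12 evaluation activities, with three
# latent "styles" each driving its own subset of activities.
styles = rng.normal(size=(169, 3))
true_loadings = np.zeros((12, 3))
true_loadings[0:4, 0] = 0.7   # scientific impact activities
true_loadings[4:8, 1] = 0.6   # seat-of-pants activities
true_loadings[8:10, 2] = 0.8  # scientific dissemination activities
ratings = styles @ true_loadings.T + rng.normal(scale=0.5, size=(169, 12))

# Extract three factors and rotate to a varimax solution (as in note 4).
fa = FactorAnalysis(n_components=3, rotation="varimax").fit(ratings)
loadings = fa.components_.T  # rows = activities, columns = factors

print(loadings.round(2))  # activities that vary together load on one factor
```

Activities that frequently co-occur end up with large loadings on the same column, which is the statistical sense in which they "vary together."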

Results
Three major clusters of evaluation activities emerged from the factor
analysis, pointing to three major "styles" of public relations evaluation
among practitioners in the study. 4 The three clusters, or factors, were
studied and labeled as the scientific impact style, the seat-of-pants style and
the scientific dissemination style of evaluation.

The first factor or cluster of activities to emerge from the analysis included
four evaluation activities with high factor loadings. A factor loading may
be viewed as a score (ranging from zero to one) which shows the relation
between the activity actually measured and the underlying "style" of eval-
uation that all activities in that cluster share. The higher the factor loading,
the better that particular activity measures that underlying "style" or approach
to evaluation.
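Reading high-loading items off such a matrix can be illustrated with the loadings from Tables 1-3. The 0.5 cutoff used here to call a loading "high" is an assumption for illustration, not a threshold stated in the article.

```python
# Hypothetical example: given a loadings matrix (rows = activities,
# columns = factors), list which activities define each "style".
import numpy as np

loadings = np.zeros((12, 3))
loadings[0:4, 0] = [0.77, 0.69, 0.68, 0.57]  # Table 1: scientific impact
loadings[4:8, 1] = [0.57, 0.54, 0.54, 0.53]  # Table 2: seat-of-pants
loadings[8:10, 2] = [0.89, 0.60]             # Table 3: scientific dissemination

CUTOFF = 0.5  # illustrative "high loading" threshold
for factor in range(loadings.shape[1]):
    high = np.flatnonzero(np.abs(loadings[:, factor]) >= CUTOFF)
    print(f"Factor {factor + 1}: activities {high.tolist()}")
```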
In Table 1, the four items that best measure the scientific impact style of
evaluation are displayed. As indicated by the focus column in the table,
this style includes both preparation and impact evaluation. As indicated by
the methods column, all four activities reflect a scientific approach. The two
impact activities are quantified measures of public reactions to the organi-
zation and its public relations program. The two preparation activities
involve attempts to anticipate public reactions to various messages of a public
relations program before the program is implemented. In short, all four
activities involve efforts to evaluate public reactions to the organization and
its public relations program, two of them occurring before implementation
and two of them occurring during or after implementation.
The second factor or cluster of activities emerging from the factor analysis
included four items with high factor loadings. As indicated by the focus
column in Table 2, the three focus areas of preparation, dissemination and
impact are included in this cluster of evaluation activities. The item with
the highest factor loading involves unsystematic evaluation of dissemination
of public relations communication, while the item with the third highest
loading is the subjective evaluation of messages during preparation. The
remaining items involve informal and subjective techniques for evaluating
impact. Significantly, these four activities share a common dependence on
"seat-of-the-pants" methods. None of the activities clustered here involves
quantitative or social scientific techniques.

TABLE 1
Factor 1: Scientific Impact Evaluation

Factor
Loading  Method      Focus        Description
  .77    scientific  impact       I check PR impact through interviews with a
                                  scientifically selected cross-section of
                                  significant publics.
  .69    scientific  preparation  I prepare communications by testing
                                  preliminary message strategies and formats
                                  on focus groups drawn from publics involved.
  .68    scientific  preparation  I prepare communications by first reviewing
                                  relevant published surveys (Gallup, Harris,
                                  Field, etc.) on attitudes of publics involved.
  .57    scientific  impact       I check PR impact through ongoing tabulated
                                  counts of public complaints by phone or
                                  letter.

TABLE 2
Factor 2: Seat-Of-Pants Evaluation

Factor
Loading  Method  Focus          Description
  .57    S-O-P   dissemination  I monitor dissemination of messages (news
                                releases, etc.) through my close personal
                                contacts among mass media professionals.
  .54    S-O-P   impact         I check PR impact by attending meetings and
                                hearings of groups representative of
                                significant publics.
  .54    S-O-P   preparation    I check communication strategies during
                                preparation by reviewing them with
                                practitioner colleagues, who apply their own
                                professional standards.
  .53    S-O-P   impact         I check PR impact by keeping my eyes and ears
                                open to the reactions of my personal and
                                public contacts.
The third factor or cluster of evaluation activities included only two items
with high factor loadings. As displayed in Table 3, both items are measures
of dissemination efforts and both items involve quantitative or scientific
techniques for completing those measures. The scientific measure of dis-
semination appears to be a specialized area of evaluation, a specialization
not shared in either preparation or impact evaluation.
Every subject in the study was given a score on each of the three styles
of evaluation, based on how frequently he or she engaged in activities that
make up each cluster or style of evaluation. 5
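The scoring procedure (note 5) is a weighted, additive scale: each activity rating is normalized, weighted by its factor score coefficient, and summed. A minimal sketch, with simulated ratings and made-up coefficients:

```python
# Sketch of a factor-score scale: z-score each activity, weight by its
# factor score coefficient, and sum. Ratings and coefficients here are
# simulated/hypothetical, not the study's actual values.
import numpy as np

rng = np.random.default_rng(1)
ratings = rng.integers(1, 6, size=(169, 12)).astype(float)  # survey items

# Normalize each activity across the 169 respondents.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)

# Hypothetical coefficients for one style: nonzero only for the four
# activities that load on that factor.
coef = np.array([0.30, 0.25, 0.24, 0.18, 0, 0, 0, 0, 0, 0, 0, 0])

style_score = z @ coef  # one weighted, additive score per subject
print(style_score.shape)
```

Because each column of z is centered, the resulting scale has mean zero; a subject's score reflects how far above or below average that subject is on the factor's defining activities.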
In addition to scores for each of the three evaluation styles, all subjects
were given a score on the two major organizational roles. The 24 activities
measuring organizational roles played by public relations practitioners were
derived from a 1979 study of practitioners drawn from the membership list
of the Public Relations Society of America. Broom's study of practitioner
roles identified activities indicative of different organizational roles from an
extensive review of the literature. In 1981, Broom's data were analyzed
again using factor analysis to identify clusters of activities that point to
underlying role types played by practitioners. 6

TABLE 3
Factor 3: Scientific Dissemination Evaluation

Factor
Loading  Method      Focus          Description
  .89    scientific  dissemination  I monitor the dissemination of messages
                                    (news releases, etc.) through a
                                    comprehensive clip file and log of inches
                                    placed, reach, and other vital statistics.
  .60    scientific  dissemination  I monitor dissemination of messages (news
                                    stories, editorials, letters to editors)
                                    through a formal, ongoing content analysis
                                    of items in the clip file.

Two major roles emerged from this analysis. Full details of the secondary
analysis are reported elsewhere. 7 The two major roles that emerged were:
Communication Manager Role--In this role, the practitioner is regarded
as the organization's expert on solving public relations problems.

Practitioners playing this role make communication policy decisions
and implement a systematic PR planning process. At the same time,
practitioners are also held accountable for the success or failure of PR
efforts. Practitioners with top scores on this scale generally hold man-
agement positions, have about 11 years experience in public relations
and earn about $33,000 annually.
Communication Technician Role--In this role, the practitioner is immersed
in production of brochures, pamphlets, photographs and graphics.
The practitioner writes news releases and handles technical aspects of
producing PR materials. Practitioners in this role implement decisions
made by others. Communication technicians play no part in decision
making, nor are they held accountable for public relations successes
or failures. Practitioners with top scores on this scale generally hold
staff positions, have about eight years of public relations experience
and earn about $22,000 annually.
Broom's 1979 data were used to develop a communication manager role scale
and a communication technician role scale. These scales were then used on
the data of this study to compute manager scores and technician scores for
each subject in the study. 8

Hypotheses
With three "styles" of public relations evaluation identified, hypotheses
linking these "'styles" to practitioner organizational roles were developed.
Druck and Hiebert provide a model of professional development in public
relations that served as the base for hypotheses linking career levels (orga-
nizational roles) to evaluation activities. 9
The communication manager seemed likely to adopt evaluation innovations,
based on implications of the Druck and Hiebert model of career
levels. 10 In the role of problem solver, decision maker and planner, the
managerial practitioner may view evaluation as advantageous and compat-
ible with other aspects of the role. As the organization's public relations
expert, practitioners playing the managerial role are expected to be know-
ledgeable about innovations in public relations and expected to demonstrate
leadership in new approaches to old problems.
Communication technicians, on the other hand, would likely see little
relation between styles of evaluation and their organizational roles. In the
Druck and Hiebert model, the "beginning professional" closely resembles
the communication technician of this study. Evaluation activities are not
emphasized at this career level, though the "beginning professional" is
expected to "undertake basic research" in the preparation of public relations
materials. 11 Implications of the model suggest that no relationship would
exist between communication technician role scores and scores on the
various styles of evaluation.

17
Public Relations Review
The following hypotheses were put to test:

h1: Communication manager role scores are significantly and positively
correlated with the scientific impact style of evaluation.
h2: Communication manager role scores are significantly and positively
correlated with the seat-of-pants style of evaluation.
h3: Communication manager role scores are significantly and positively
correlated with the scientific dissemination style of evaluation.
h4: Communication technician role scores are not significantly correlated
with the scientific impact style of evaluation.
h5: Communication technician role scores are not significantly correlated
with the seat-of-pants style of evaluation.
h6: Communication technician role scores are not significantly correlated
with the scientific dissemination style of evaluation.

Pearson product-moment correlation coefficients and significance levels
were computed for each of the six relationships. Magnitudes of relation-
ships were tested for statistical significance. 12
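The test itself can be sketched as follows. The data are simulated to roughly echo the manager-role/impact-style association, so the resulting r and p values are illustrative, not the figures reported in Table 4.

```python
# Illustrative Pearson correlation test at the article's p < .01 criterion
# (note 12); scores are simulated, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 113  # sample size for the manager-role/impact-style pair (Table 4)
manager = rng.normal(size=n)
impact = 0.26 * manager + rng.normal(size=n)  # built-in association

r, p = pearsonr(manager, impact)
print(f"r = {r:.2f}, p = {p:.4f}, significant at .01: {p < 0.01}")
```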

Results
The correlation coefficients and significance levels for the six relationships
are provided in Table 4. As indicated, hypotheses h1, h2, h4, h5 and h6 were
confirmed. However, h3, which predicted a significant relationship between
communication manager scores and the scientific dissemination style of
evaluation, was rejected. No significant relationship was found.

TABLE 4
Correlation Coefficients for Organizational Role Scores and Evaluation Styles

                            Communication    Communication
                            Manager Role     Technician Role
Scientific Impact           .26              -.02
Evaluation                  p < .01          N.S.
                            N = 113          N = 116

Seat-Of-Pants               .54              -.00
Evaluation                  p < .01          N.S.
                            N = 112          N = 115

Scientific Dissemination    .13              .10
Evaluation                  N.S.             N.S.
                            N = 115          N = 118

As predicted, no style of public relations evaluation is related to the
communication technician role scores. This study provides evidence that the
role of communication technician operates independently of any identified
approach or style of public relations evaluation. As predicted, the commu-
nication manager role scores are positively correlated with both the scientific
impact and the seat-of-pants styles of evaluation. Practitioners with high
communication manager scores also have high scores on the scales measuring
scientific impact and seat-of-pants styles of evaluation. However, contrary to
prediction, high communication manager scores are independent of scores on
the scientific dissemination style scale. Further analysis indicates that only
practitioners who specialize in media relations are heavy users of the
scientific dissemination style of evaluation. 13
Several major implications of the study should be considered. First,
three major styles of public relations evaluation are emerging among public
relations practitioners studied. One style, the seat-of-pants style, emphasizes
personalized and subjective checks of all parts of the public relations proc-
ess. Another style, the scientific impact style, stresses quantitative or scientific
measures of program impact, both before and after program implementation.
Finally, some practitioners specialize in numeric analysis of their clip files.
Nothing stops practitioners from adopting more than one style of evalua-
tion. In fact, high-level manager practitioners have adopted several styles.
These emerging styles of program evaluation hold important implications
for the public relations profession.
These implications become clear when styles of evaluation are matched
up with the types of practitioners who use them. Most interesting, and
most unexpected, the common practice of judging public relations success
through clip file audits has become passé among high-ranking, top-paid
managerial practitioners. Instead, these practitioners use both their own
intuition and gut reactions, as well as scientific techniques, to continually
check on their public relations programs. This is counter to the evolution
outlined by Robinson. Robinson argues that the "seat-of-the-pants" approach
to problem solving would be slowly replaced by more rigorous scientific
approaches to evaluation and problem solving. 14 The evidence here shows
that scientific impact evaluation serves to supplement rather than replace
"seat-of-the-pants" approaches to problem solving and evaluation.
This supplemental use of scientific evaluation is consistent with the imper-
ative that managers make decisions and act even when all the information
useful to decision making is not available. Scientific evaluation research is
not always completed in a timely fashion. 15 Sometimes the cost of a rigorous
scientific evaluation is prohibitive. 16 Under such circumstances, managerial
practitioners act on the basis of the best information available, including
their own hunches and intuition. While one can imagine a movement away
from intuition and subjective evaluation in the abstract, evidence here
indicates that in the real world of public relations, the practitioner who

manages the public relations effort uses multiple styles of evaluation, both
subjective and objective, both "seat-of-the-pants" and "scientific."

Most disturbing, perhaps, is the finding that practitioners playing tech-
nical roles in public relations are not likely to adopt any style of evaluation
at all. If public relations is viewed as purposive communication and action--
as a goal-directed activity to maintain or change relations between an orga-
nization and its priority publics--then all practitioners must see evaluation
as integral to the process. The technical practitioner who writes the news
release or designs the brochure, who handles graphics and production,
must do so with an eye to its purpose. If not, the use of various evaluation
techniques by high-level practitioners to check and refine the public rela-
tions program may have little impact on the day-to-day activities of lower-
level staffers.
One might even hypothesize that program evaluation represents an
important schism between the high-level, well-paid managerial practitioner
and the lower-level technical practitioner. Concern over this schism is rein-
forced by the finding that the technical practitioner is not an entry-level
staffer playing a transitory role. Rather, technicians in this study averaged
37 years of age with nearly eight years of professional public relations
experience. These age and experience characteristics, while lower than
those for managers, were not significantly different. 17 Contrary to the career
level path outlined by Druck and Hiebert, some practitioners seem to opt
for the technical role as a relatively permanent career choice. One charac-
teristic that typifies such practitioners is the total absence of an evaluation
style, either "seat-of-the-pants" or "scientific."

References
1. Comparisons were made between PRSA members in this survey (N = 65, or 38% of the
sample) and a national survey of PRSA members in 1979. A larger number of San Diego PRSA
members work for non-profit organizations (46% in San Diego vs. 30% nationally). Regarding
income, years of practitioner experience, tenure at present job, media experience, age, edu-
cation and male domination, San Diego PRSA members closely resemble the national PRSA
members. Because other professional organizations were included in this study, the overall
sample included more women, was younger, earned lower salaries, and had fewer years of
practitioner experience than the national PRSA sample. This represents differences in the
makeup of the professional organizations. The San Diego sample can be inferred to be roughly
representative of a similar national sample drawn from the same organizational membership
lists.
2. Norman H. Nie and others, Statistical Package for the Social Sciences, second edition (New
York: McGraw-Hill, 1975), pp. 468-514.
3. R. J. Rummel, Applied Factor Analysis (Evanston: Northwestern University Press, 1970), pp.
19-21.
4. The principal factoring with iterations method was used to extract initial factors. Three
factors with eigenvalues greater than 1.0 were rotated to a varimax solution. The three factors
accounted for 57.2% of the variance among the 12 indicators of evaluation activities.
5. Factor score scales were constructed by weighting the normalized score for each relevant
variable by its factor score coefficient and adding all such weighted, normalized variables
together. Factor score scales are weighted, additive scales.
6. From a 1979 national survey of PRSA members conducted by Dr. Glen M. Broom, 355
respondents were selected and their responses to the 24 measures of organizational roles were
analyzed using principal factoring with iterations rotated to a varimax solution. Excluded from
the analysis were all practitioners working for advertising or public relations agencies. A
separate factor analysis for these outside practitioners yielded a complex, six-factor solution,
indicating that outside consultants play different organizational roles within a client's orga-
nizational structure than do internal practitioners. Among inside practitioners, the four roles
emerging from the factor analysis were interpreted as communication manager, communication
technician, media relations specialist and communication liaison roles. The major roles, communication
manager and communication technician, which are analyzed in this article, account for the majority
of variance among the 24 indicators of organizational roles.
7. David M. Dozier, "The Diffusion of Evaluation Methods Among Public Relations Practi-
tioners," paper presented to the Public Relations Division, Association for Education in Jour-
nalism Annual Convention, East Lansing, Mich., Aug. 9, 1981.
8. As with the scales for style of evaluation, the communication manager role scores and the
communication technician role scores were computed from factor score coefficients multiplied
by the normalized score for the relevant variables. The score was generated from a weighted,
additive scale.
9. Kalman B. Druck and Ray E. Hiebert, Your Personal Guidebook To Help You Chart A More
Successful Career in Public Relations (New York: Public Relations Society of America, 1979), pp.
26-54.
10. The communication manager role in this study most closely parallels the "senior professional"
in the Druck and Hiebert career level model.
11. Ibid., p. 18.
12. A confidence level of .99 was established prior to the test. A relationship was considered
significant only if there was a 99% chance that the relationship discovered in the sample also
existed in the population of practitioners.
13. The media relations specialist is a minor organizational role performed by some practitioners
with above-average media experience. This role involves maintaining media contacts, placing
press releases and informing others in the organization of what the media report about the
organization. Practitioners with high scores on this scale make only slightly more than com-
munication technicians ($23,000 vs. $22,000), average 39 years of age and mostly occupy staff
rather than managerial positions.
14. Edward J. Robinson, Public Relations and Survey Research (New York: Appleton-Century-Crofts,
1969), p. 13.
15. Scott M. Cutlip and Allen H. Center, Effective Public Relations (Englewood Cliffs, N.J.:
Prentice-Hall, 1978), p. 214.
16. Ibid.
17. Indeed, when analysis of variance was conducted to see if practitioners predominantly
playing one role were different in age and professional experience than practitioners playing
other roles, no significant differences were found. While communication technicians were
slightly younger and had fewer years of public relations experience than communication
managers, these differences are so slight in the sample that no true differences can be reason-
ably assumed to exist among the larger population of practitioners. (For age, F = 0.75, d.f. =
3, 151, significance = 0.52. For years of professional PR experience, F = 1.41, d.f. = 3, 152,
significance = 0.24.)
