
A Primer on

Communication and
Media Research

Editor:
Professor Fernando dlC. Paragas, PhD

Authors:

Associate Professor Julienne Thesa Y. Baldo-Cubelo, PhD


Assistant Professor Jon Benedik A. Bunquin, MA
Associate Professor Jonalou S.J. Labor, PhD
Assistant Professor Ma. Aurora Lolita Liwag-Lomibao, MA
Professor Fernando dlC. Paragas, PhD
Professor Elena E. Pernia, PhD
Associate Professor Ma. Rosel S. San Pascual, PhD
Assistant Professor Randy Jay C. Solis, PhD
Professor Violeda A. Umali, PhD

A PRIMER ON COMMUNICATION AND MEDIA RESEARCH
Version 1  Released online on 18 January 2021

Copyright © 2021 UP CMC Department of Communication Research

All rights reserved. This manuscript is NOT FOR SALE. It may be reproduced, distributed, or transmitted in any
form or by any means, including photocopying, recording, or other electronic or mechanical methods, FOR
EDUCATIONAL PURPOSES ONLY. No part of this manuscript may be used or excerpted in an academic,
commercial, non-commercial, or trade publication, in print, electronic, online, or any other format, without
prior, explicit, and written permission from the UP CMC Department of Communication Research.

Editing and layout by Professor Fernando dlC Paragas, PhD


Cover design by Assistant Professor Jon Benedik A. Bunquin, MA

Department of Communication Research


2F, Annex Building, Plaridel Hall
College of Mass Communication
University of the Philippines Diliman

Ylanan Road, UP Diliman Campus


1101 Quezon City, Philippines
Tel: (632) 9818500 loc 2665
Email: communicationresearch.upd@up.edu.ph

Prologue
ABOUT THIS PRIMER
The UP CMC Department of Communication Research, as a Commission on Higher Education Center of
Excellence in Communication since 1999, constantly endeavors to improve scholarship on Philippine
communication and media. Through its Salik Research Hub, the department has started to implement a
comprehensive and integrated program of instruction, research, and extension activities.

Among the key initiatives of Salik is the development of the HANDBOOK ON COMMUNICATION AND MEDIA
RESEARCH IN THE PHILIPPINES which is now in the penultimate stage of writing by our Department’s faculty
members. The Handbook exclusively employs studies on Philippine communication and media as the cases
which exemplify specific lessons and serve as foundations for various learning activities.

Informed by the principles of and approaches in Outcomes-Based Education (OBE), the Handbook engages
learners with substantive content through the following components:

a. Learning outcomes and key questions—Chapters and sections clearly articulate expectations and
guideposts regarding what students must learn.

b. Discussion of core concepts—Chapters draw from the literature to explicate foundational concepts in
communication and media research. Chapters present, where appropriate, historical contexts, important
personalities, and Philippine applications as well as engage critically important issues in communication
and media.

c. Formative exercises—The discussion of key topics is structured and segmented to facilitate “We do, you
do, and I do” learning. In the “we do” stage, the discussion begins with one study as an exemplar case,
which students and their teachers collectively use to examine otherwise abstract concepts in a specific
lesson. The discussion then continues to the “you do” stage, where students, individually or in groups,
apply the lesson’s concepts in a guided project.

d. Summative applications—For the “I do” stage, students independently create a project which
demonstrates their holistic understanding of the lessons embedded in the previous formative exercises.

e. Glossary and references—The Handbook contains a thorough inventory of definitions and source
materials for various topics in communication and media research.

Even as the Handbook now enters pre-press work, its publication is still over a year away.

Because of the urgent need for local and accessible learning resources on communication and media—a long-
standing concern that has been exacerbated by the shift to remote learning because of the COVID-19
pandemic—the Department has decided to excerpt this PRIMER ON COMMUNICATION AND MEDIA RESEARCH from
the main Handbook. This Primer is designed as a stopgap measure that can be temporarily used in
communication and media research classes until the complete Handbook is published. As such, the Primer is
formatted as a compilation of class handouts and contains only the basic conceptual notes in the research
process. The formative exercises and summative applications, as well as the local studies used as
illustrative examples to explain the principles and procedures discussed in the Handbook, are not included in
this Primer.

This Primer helps address the worsening scarcity of learning resources during the pandemic. Because we have
many other concerns regarding research during this extraordinary period, this Primer concludes with a
hopeful write-up on how we can facilitate the Dawn of a New Era for communication and media research.

The Handbook and this Primer are projects of the faculty members of the UP CMC Department of
Communication Research, with support from the Enhanced Creative Work and Research Grant program of the
Office of the Vice President for Academic Affairs of the University of the Philippines.

Professor Fernando dlC Paragas, PhD


Primer Editor

ABOUT THE UP CMC
DEPARTMENT OF COMMUNICATION RESEARCH
Since its inception in 1975, the Department of Communication Research of the University of the Philippines
College of Mass Communication has offered courses designed to develop scholarship, skills-proficiency, and
professionalism among its students. The Department undertakes research and extension projects which
benefit stakeholders within and beyond the academe.

By teaching and conducting communication research within the context of processes and effects, and
grounding these in practical experience through fieldwork and internships, the Department ensures that its
graduates are equipped to contribute to the practice of social research in the academe, in the communication
and media industries, and in government and non-government sectors.

The Faculty is at the cutting edge of communication research in the Philippines. Its members are experts in
basic and applied quantitative and qualitative research, as well as in the related areas of political
communication, health communication, social mobilization, strategic communication, advertising, and public
relations/information, among others. The Faculty also extends its services to assist developmental efforts by
local and international government and non-government organizations as well as business and industry groups.
Through its research, publications, and extension work, the Faculty thus contributes to the development of
its discipline in particular and of society as a whole.

The Academic Programs

The Department’s degree programs combine theories, methods, practice, and ethics in the teaching of
communication and media research. Through the programs’ various lecture and seminar courses, students
negotiate a diverse array of Philippine and global theories, appreciate the nuances of research methodology,
and study as well as critique public, corporate, and social marketing programs.

Communication Research students…


- Learn the nitty-gritty of quantitative and qualitative research in communication and media;
- Enjoy courses that combine classroom learning and hands-on, fieldwork experience; and,
- Get involved in various extra-curricular activities that make for a holistic academic experience.

The Department’s academic programs are as follows:

BA Communication Research. The program develops scholarship, skills, and proficiency among its students by
teaching and conducting communication research and by grounding these in practical experience through
fieldwork and internship. It ensures that its graduates are equipped to contribute to the practice of social
research in various sectors.

Master of Arts in Communication. The program contributes to a comprehensive and innovative advancement
of communication as an academic discipline and professional field of study. It seeks to develop critical inquiry
and high-level research by striking a balance between communication research theory and practice. It also
fosters awareness and responsibility in communication and its application in other disciplines.

PhD in Communication. The program provides advanced graduate training in theory, research, policy,
planning, and management which enables qualified students to carry out independent research in
communication and related disciplines and to pursue careers in academic, government, and private
communication media agencies and communication-related institutions. It offers a platform to attain
distinction in the field of communication for professionals in the communication discipline and related fields.

Flagship programs

The Department’s flagship programs are SALIK, PACMRI, the NCRC, and the CRIC.

SALIK

Salik is the Department’s Research Hub through which it integrates its instruction, research, and extension
activities. Salik is the root word of Saliksik and Mananaliksik, the Filipino words for research and
researcher, respectively. Salik has three components:

Suri is the research component which comprises the Department’s two thematic research laboratories:
- Salaysay surfaces and determines Filipino communicative experience
- Subaybay examines Philippine media content and reception

Sanay pertains to the service and extension initiatives of the Department. It offers training programs in
communication and media research, strategic communication, and allied topics. Within Sanay, the
Department is also developing teaching materials such as a handbook on communication/media research
and case studies on Philippine communication/media.

Hanay is the Department’s digital repository initiative which consolidates studies and datasets on Filipino
and Philippine communication and media.

THE PHILIPPINE ASSOCIATION FOR COMMUNICATION AND MEDIA RESEARCH, INC.

The Department is the Secretariat of the Philippine Association for Communication and Media Research,
Inc. (PACMRI), an organization of, for, and by scholars. PACMRI seeks to advance knowledge about
communication and media phenomena that involve Filipinos and the Philippines.

First discussed at the National Communication Research Conference (NCRC) in Baguio City in 2017,
PACMRI was launched at the NCRC 2018 at UP Diliman.

THE COMMUNICATION RESEARCH CONFERENCES

The Department holds two flagship communication research conferences which provide students and
their faculty mentors with a platform to present and discuss their research papers beyond the classroom.

The National Communication Research Conference (NCRC), which started in 2012, was the geographic
expansion of the Communication Research Student Conference which began in 2008.

In 2016 and 2019, the Department hosted the Communication Research International Conference (CRIC).

THE AUTHORS

Associate Professor Julienne Thesa Y. Baldo-Cubelo, PhD

Dr. Baldo-Cubelo has a BA degree in Broadcast Communication (cum laude), an MA degree in
Women and Development Studies, and a PhD degree in Communication from UP Diliman. She
was an awardee of the UP College of Mass Communication (CMC) Natatanging Guro (Junior
Faculty) in 2017.

Assistant Professor Jon Benedik A. Bunquin, MA

Asst. Prof. Bunquin holds a BA degree in Journalism (cum laude) and an MA degree in
Communication from UP Diliman. He received the award for best master’s thesis in
Communication at UP CMC for his work on the Filipino youth’s political communication
networks.

Associate Professor Jonalou S.J. Labor, PhD

Dr. Labor completed his PhD degree in Communication at UP Diliman. He was awarded
Best PhD Dissertation at UP CMC for his work “Performance of Online Faces in Mobile
Dating Applications among Filipino Millennials.” He earned his master’s (with highest
honors) and undergraduate (cum laude) degrees in communication arts from UP Los
Baños.

Assistant Professor Ma. Aurora Lolita Liwag-Lomibao, MA

Asst. Prof. Lomibao completed her BA degree in Journalism and MA degree in Media
Studies and is on her way to earning her PhD degree in Communication at UP Diliman.
She has worked with UN Women and was the Executive Director of Kanlungan Centre
Foundation, a non-government organization which provides direct services to distressed
OFWs and trafficked women.

Professor Fernando dlC Paragas, PhD

Dr. Paragas earned his PhD degree in Communication from Ohio University, USA where
he was a Fulbright scholar. He has a master’s degree in Urban and Regional Planning
(Dean’s Medallion recipient) and a BA degree in Communication Research (best
undergraduate thesis awardee) from UP Diliman. He was a recipient of the 2019 Gawad
Tsanselor para sa Natatanging Guro ng UP Diliman.

Professor Elena E. Pernia, PhD

Dr. Pernia completed her BA degree in Journalism, MA degree in Communication Research, and
PhD degree in Communication at UP Diliman. She was a Postdoctoral Fellow in Communication
at the Center for Communication Programs of the Bloomberg School of Public Health of the
Johns Hopkins University. Dr. Pernia was an awardee of the UP College of Mass Communication
Natatanging Guro (Senior Faculty) in 2017.

Associate Professor Ma. Rosel S. San Pascual, PhD

Dr. San Pascual has a BA degree in Communication Research, a master’s degree in Development
Economics, and a PhD degree in Communication from UP Diliman. She also has an MA degree in
Communications and New Media from the National University of Singapore. She is the chair of
the UP CMC Graduate Studies Department.

Assistant Professor Randy Jay C. Solis, PhD

Dr. Solis, the current chair of the Department of Communication Research, earned his
PhD in Communication from the School of Journalism and Communication of the
Chinese University of Hong Kong where he was awarded the Lion Dr Francis K Pan
Scholarship Award from 2018 to 2020 and served as the Editorial Assistant for the
Chinese Journal of Communication. He completed his undergraduate management
degree and Master of Communication degree at the Ateneo de Manila University.

Professor Violeda A. Umali, PhD

Dr. Umali has master’s degrees in demography from the UP Population Institute (UPPI)
and in mass communication from the Nanyang Technological University in Singapore.
She earned her PhD in Political Science from the University of Vienna. She served as
Director of the UP Diliman Research Dissemination and Utilization Office and the UP
Diliman Office for the Advancement of Teaching.

TABLE OF CONTENTS
Part 1: Conceptualizing Research in Communication and Media
1. INTRODUCTION & REVIEW OF RELATED LITERATURE .................................................................................... 2
1.1. Thinking it through .............................................................................................................................. 2
1.1.1. Select a research topic .......................................................................................................... 2
1.1.2. Determine topic relevance .................................................................................................... 2
1.1.3. Review the literature ............................................................................................................. 3
1.1.4. State the research problem ................................................................................................... 3
1.2. Writing it down .................................................................................................................................... 3
1.2.1. Writing the introduction........................................................................................................ 3
1.2.2. Writing the review of related literature ................................................................................ 4

2. ETHICS IN RESEARCH ...................................................................................................................................... 5


2.1. General principles ................................................................................................................................ 5
2.1.1. Autonomy .............................................................................................................................. 5
2.1.2. Nonmaleficence ..................................................................................................................... 5
2.1.3. Beneficence ........................................................................................................................... 5
2.1.4. Justice .................................................................................................................................... 5
2.2. Ethical issues in communication and media research ......................................................................... 6
2.2.1. Ethical considerations in the use of existing materials ......................................................... 6
2.2.2. Ethical considerations in the collection of data .................................................................... 6

3. STUDY FRAMEWORK ...................................................................................................................................... 7


3.1. Selecting theories ................................................................................................................................ 7
3.2. Operationalizing................................................................................................................................... 7
3.2.1. Theoretical level .................................................................................................................... 8
3.2.2. Conceptual level .................................................................................................................... 8
3.2.3. Operational level ................................................................................................................... 8
3.3. Connecting theory ............................................................................................................................... 9
3.4. Doing the framework ........................................................................................................................... 9
3.5. Writing it .............................................................................................................................................. 9

4. CHOOSING APPROPRIATE OPERATIONAL DEFINITIONS ............................................................................... 10


4.1. Dimensions and indicators................................................................................................................. 10
4.2. Characteristics of good operational definitions ................................................................................. 10
4.3. Scales ................................................................................................................................................. 11
4.4. Validity and reliability in research ..................................................................................................... 11
4.4.1. Validity ................................................................................................................................. 11
4.4.2. Reliability ............................................................................................................................. 12
4.5. Research instrument .......................................................................................................................... 13

Part 2: Designing Research in Communication and Media
5. RESEARCH METHODOLOGY .........................................................................................................................16
5.1. Developing the research methodology ..............................................................................................16
5.2. Differentiating methodology from methods and
data-gathering or data-construction techniques ...............................................................................16
5.2.1. Methods ...............................................................................................................................16
5.2.2. Data-gathering or data-construction techniques .................................................................18
5.3. Writing the sections of the methodology ...........................................................................................19

6. CONTENT ANALYSIS .......................................................................................................................................21


6.1. Basic premises ....................................................................................................................................21
6.2. Basic components of content analysis................................................................................................22
6.3. Procedures for doing content analysis ...............................................................................................23

7. TEXTUAL ANALYSIS ........................................................................................................................................25


7.1. Basic premises ....................................................................................................................................25
7.2. Procedures for doing textual analysis.................................................................................................26

8. SURVEYS ........................................................................................................................................................29
8.1. Basic premises ....................................................................................................................................29
8.2. Procedures for doing surveys .............................................................................................................30

9. ETHNOGRAPHY ..............................................................................................................................................32
9.1. Basic premises ....................................................................................................................................32
9.2. Basic concepts ....................................................................................................................................33
9.3. Procedures for doing ethnography .....................................................................................................34

10. EXPERIMENTS ................................................................................................................................................37


10.1. Basic concepts ....................................................................................................................................37
10.2. The classic experiment .......................................................................................................................38
10.2.1. The basic components ..........................................................................................................38
10.2.2. Other requirements .............................................................................................................38
10.2.3. Conditions for causality ........................................................................................................39
10.3. Experimental designs ..........................................................................................................................39
10.4. Procedures for doing experiments .....................................................................................................40
10.4.1. Conceptualizing the experiment ..........................................................................................40
10.4.2. Planning your experiment ....................................................................................................40
10.4.3. Implementing your experiment ...........................................................................................42

11. RECEPTION ANALYSIS ....................................................................................................................................43


11.1. Basic concepts ....................................................................................................................................43
11.2. Procedures for doing reception analysis ............................................................................................45
11.2.1. Qualitative techniques in reception analysis .......................................................................46
11.2.2. Quantitative research techniques ........................................................................................51

12. CASE STUDY .................................................................................................................................................. 53
12.1. Basic premises ................................................................................................................................... 53
12.1.1. Case study vs other methods .............................................................................................. 53
12.1.2. Definition ............................................................................................................................. 54
12.1.3. Characteristics of a case study ............................................................................................ 56
12.2. Procedures for doing case study........................................................................................................ 57

Part 3: Analyzing & Reporting Research in Communication and Media


13. THE RESEARCH DISSEMINATION PLAN ......................................................................................................... 64
Planning for research dissemination ............................................................................................................ 64

14. QUALITATIVE DATA ANALYSIS AND INTERPRETATION ................................................................................. 66


14.1. Overview ............................................................................................................................................ 66
14.1.1. Purpose of analysis in qualitative research ......................................................................... 66
14.1.2. The nature of data ............................................................................................................... 67
14.1.3. Source methods in qualitative data analysis ....................................................................... 68
14.1.4. Key principles in qualitative data analysis ........................................................................... 69
14.1.5. Key considerations in qualitative data analysis ................................................................... 69
14.2. The process of qualitative data analysis ............................................................................................ 72
14.2.1. Data management through data reduction ........................................................................ 73
14.2.2. Description as analysis: Analyzing for content .................................................................... 76
14.2.3. Interpretation as analysis: Analyzing for context ................................................................ 79
14.3. Qualitative research writing .............................................................................................................. 81
14.3.1. First-person perspective ...................................................................................................... 81
14.3.2. Positionality ......................................................................................................................... 82
14.3.3. “Thick description” and metaphors ..................................................................................... 82
14.3.4. Writing as drafts .................................................................................................................. 82
14.3.5. So what? .............................................................................................................................. 83
14.3.6. Other writing tips ................................................................................................................ 83
14.4. Computer software for qualitative data analysis .............................................................................. 85
14.4.1. What computers can do in aid of analysis ........................................................................... 85
14.4.2. What they cannot do ........................................................................................................... 86
14.4.3. Which software do I use? .................................................................................................... 87

15. QUANTITATIVE DATA ANALYSIS AND INTERPRETATION .............................................................................. 88


15.1. Overview ............................................................................................................................................ 88
15.1.1. Purpose of quantitative data analysis ................................................................................. 88
15.1.2. Nature and sources of data for quantitative data analysis ................................................. 88
15.1.3. Basic guiding principles ....................................................................................................... 89
15.2. Key concepts ...................................................................................................................................... 91
15.2.1. Descriptive statistics ............................................................................................................ 91
15.2.2. Inferential statistics ............................................................................................................. 95
15.2.3. The process of quantitative data analysis ........................................................................... 97
15.3. Interpreting Findings ....................................................................................................................... 125
15.4. Managing quantitative data ............................................................................................................ 126
15.4.1. Content analysis data ........................................................................................................ 126
15.4.2. Experiment data ................................................................................................................ 127
15.5. Writing quantitative research reports ............................................................................................. 127

16. MIXED METHODS ANALYSIS ........................................................................................................................129
16.1. Overview ...........................................................................................................................................129
16.1.1. Review of methods .............................................................................................................129
16.1.2. Benefits and challenges of mixed analysis .........................................................................129
16.1.3. Considerations in doing mixed method analysis ................................................................130
16.2. The Analytical Process ......................................................................................................................131
16.2.1. Single-paradigmatic mixed methods ..................................................................................131
16.2.2. Multi-paradigmatic mixed methods ...................................................................................132
16.3. Interpretation principles for mixed method studies ........................................................................132
16.3.1. Reading across data............................................................................................................133
16.3.2. Linking to theory ................................................................................................................134
16.3.3. Providing implications ........................................................................................................135

17. RESEARCH REPORTING FOR ACADEMIC AUDIENCES ..................................................................................136


17.1. Overview ...........................................................................................................................................136
17.2. Understanding the types of research reports and their audiences ..................................................137
17.2.1. Academic audience 1: Teachers, panel members, and students .......................................138
17.2.2. Academic audience 2: Academic conference organizers, journal and book editors and
reviewers ..........................................................................................................................................139
17.2.3. Academic audience 3: External audiences .........................................................................139
17.3. Components of the research reports ...............................................................................................140
17.3.1. Abstract ..............................................................................................................................140
17.3.2. Introduction........................................................................................................................140
17.3.3. Review of Related Literature ..............................................................................................141
17.3.4. Study Framework ...............................................................................................................141
17.3.5. Methodology ......................................................................................................................141
17.3.6. Results and Discussion .......................................................................................................142
17.3.7. Summary and Conclusion ...................................................................................................142
17.3.8. Implications and Recommendations ..................................................................................142
17.3.9. Bibliography .......................................................................................................................143
17.4. Key considerations in writing the research reports ..........................................................................143
17.4.1. Focus ..................................................................................................................................143
17.4.2. Organization .......................................................................................................................144
17.4.3. Tone....................................................................................................................................144

18. POPULARIZING RESEARCH ...........................................................................................................................145


18.1. Overview ...........................................................................................................................................145
18.2. Understanding the audience ............................................................................................................146
18.2.1. Identifying audiences .........................................................................................................146
18.2.2. Analyzing audiences ...........................................................................................................147
18.2.3. Crafting the key message ...................................................................................................147
18.3. Developing materials for research popularization ...........................................................................149
18.4. Visualizing data .................................................................................................................................149
18.4.1. Creating charts ...................................................................................................................150
18.4.2. Refining your visualization .................................................................................................152
18.5. Creating presentations .....................................................................................................................153
18.5.1. Designing slide presentations ............................................................................................153
18.5.2. Delivering slide presentations ............................................................................................156

18.6. Designing poster presentations ....................................................................................................... 156
18.6.1. Deciding on the content .................................................................................................... 156
18.6.2. Laying out the poster elements ......................................................................................... 157
18.6.3. Writing research briefs ...................................................................................................... 158
18.7. Disseminating in non-traditional formats ........................................................................................ 159
18.7.1. Engaging audiences through social media ........................................................................ 159
18.7.2. Making research available through digital repositories .................................................... 159
18.7.3. Self-publishing through blogs and podcasts ...................................................................... 159

Epilogue
DOING RESEARCH IN THE POST-PANDEMIC ENVIRONMENT ............................................................................ 161

PART 1
Conceptualizing Research in
Communication and Media



1. INTRODUCTION & REVIEW OF RELATED LITERATURE
by Assistant Professor Randy Jay C. Solis, PhD

In doing research, the first step in the research process is often perceived to be the easiest stage
and may be downplayed even by the most seasoned of researchers. However, experience shows
that choosing a research problem may well be the biggest source of anxiety for any student or
researcher, as the failure of a completed study may be rooted in a poorly planned beginning.

Most of the difficulty in identifying a research problem boils down to either “having no problem”
at all or a lack of focus due to the abundance of communication topics to choose from. Thus,
this section is meant to help current and future researchers address these challenges and
develop their research topics into an organized research problem and objectives (RPO).

The following procedures are meant to assure the researchers that there is nothing to fear,
because once this first step in research is done well, every other step in the research process will
fall into place.

1.1. Thinking it through

To get started, there are four key steps in identifying a topic and transforming it into a viable
research problem.

1.1.1. Select a research topic

What are the key steps in choosing which topic to pursue? Know where to start.

The formulation of the research problem and objectives begins with the selection of a research
topic or idea. Unfortunately, students and researchers find it difficult to come up with a research
idea simply because they do not know where to look for these research topics.

The following are helpful tips for drawing inspiration from various sources of research
problems:
- Choose and concentrate on your research interests
- Decide on your research goals and approach
- Review your paradigms and theories
- Look at academic and trade publications as well as current events
- Connect to the internet for insights into your research ideas
- Reflect on everyday situations

1.1.2. Determine topic relevance

The next step after determining a list of potential research ideas or topics is testing whether
these ideas are worth pursuing as a research project.

Ask yourself the following questions:


- Is it worth the research effort?
- Is the topic too broad?
- Can the problem really be investigated?
- Can the data be analyzed?
- Is the problem significant?
- Can the results of the study be generalized?



- What costs and time are involved in the analysis?
- Is the planned approach appropriate to the project?
- Is there any potential harm to the subjects?

1.1.3. Review the literature

The review of related literature, as an activity in the research process, is not only done at the
later stages of conceptualization or in the development of the research design. It is also essential
in developing a research topic and finalizing the RPO. A researcher must consult past studies to
look at what has already been done about a particular subject to determine whether a research
idea still has merit. Aside from indicating relevance, the answers to the following questions
(Wimmer and Dominick, 2006) also determine how you will choose and eventually state your
RPO, especially if there is a need to state hypotheses:
- What type of research has been done in the area?
- What has been found in previous studies?
- What suggestions do other researchers make for further study?
- What has not been investigated?
- How can the proposed study add to our knowledge of the area?
- What theories were used in related studies?
- What research methods were used in the previous studies?

1.1.4. State the research problem

After determining the general research idea and subjecting this idea to a literature review, you, as
the researcher, must now be able to write it in a statement, which may take the following forms:
- Research Question—a formally stated question or inquiry intended to provide indications
about a particular concern or issue
- Objectives—formal statements, declarative translations of the research question, which
identify what we specifically want to find out about the general research question
- Hypothesis—a formal statement proposing a relationship between two or more variables,
based on existing theory or past studies regarding the relationship between variables, and
tested in a particular study; the predicted relationship is shown to be either true or false

1.2. Writing it down

Once you have thought through your research idea, it is now time to put pen to paper. After all,
research must first be written for it to be disseminated and shared widely. Only then can it
contribute to the greater body of knowledge.

Here are some tips on how to do this.

1.2.1. Writing the introduction

The introduction serves as an overview of the entire work. Like the eyes that serve as a window
to the soul, the introduction makes the first impression, so it should be well-written, convincing,
clear, logical, and organized. If your introduction is poorly written, its reader may form a bad
impression of your study, doubt the succeeding parts of your paper, or dismiss your work
altogether.



Research outputs depend on the nature of the publication. The length and look of a journal
article, for example, may vary from those of a full-blown thesis publication. Generally, however,
the following are the subsections of an introduction to a research work:
a. Background of the study
b. Rationale of the study
c. RPO
d. Scope and limitations
e. Significance of the study

The introduction does not need to be divided into sections with the aforementioned as headings.
These components may all be subsumed in one Introduction section, so long as they are all
articulated in the body. For the purpose of explaining the components individually, however, the
succeeding section explains each one as a distinct subsection in an introduction.

The following tips may help you in terms of writing a good introduction for your study:
- Write your introduction as if it were a road map.
- Define your concepts, terms, and jargon contextually.
- Write the introduction as if you are writing a compelling story or engaging in a debate.
- Use quotes and anecdotes in your introduction judiciously.
- Make your writing accessible and convincing.
- Follow a cycle of writing and rewriting.

1.2.2. Writing the review of related literature

What is the difference between the Introduction and the Review of Related Literature (RRL)?

The main difference between the two is that the Introduction must contain all content necessary
to give the background, key concepts, rationale, and problems of the study while the RRL
contains the summaries, critiques, and comparisons of related studies used to build one’s own
research. Thus, while contents from the RRL may be found in the introduction, these must be
written as succinct findings and summaries and need not provide the entire details of these
studies. Moreover, the introduction may contain contents (statistics, historical data, etc.) which
may have been culled from non-academic journal publications as well, and thus should not be
included in the review of studies in the RRL section. Strictly speaking, the RRL must contain
scientific research studies as published in journals, research anthologies, monographs, theses,
and dissertations. But this is not to say that the RRL is merely a review of one article after
another.

The goal of the RRL is to present emerging themes borne out of the critical evaluations and
comparisons across research articles. Because of this evaluative and comparative nature of the
RRL, it is normally much longer than the Introduction. Remember that the introduction aims to
present the main thesis of the study using background information; presenting a long review of
literature in the Introduction might derail this purpose. Because of these, the RRL is naturally
located after the Introduction.



2. ETHICS IN RESEARCH
by Assistant Professor Ma. Aurora Lolita L. Lomibao, MA & Associate Professor Jonalou S.J.
Labor, PhD

Before we proceed with our study, we must consider a myriad of ethical standards that govern
the conduct of research. These guidelines can apply to specific fields of study, to the selection
of topics, or to methodological decisions.

2.1. General principles

Some ethical guidelines apply to the study of specific communities as subjects of studies.
However, research ethics generally share four principles. Wimmer and Dominick (2011) identify
them as a) autonomy; b) nonmaleficence; c) beneficence; and d) justice.

2.1.1. Autonomy

Autonomy, also called self-determination, calls for the researcher to “always respect the rights,
values, and decisions of other people” (Wimmer & Dominick, 2011, p. 67). In research, this
means that people, as research subjects, have the ultimate right to decide who knows what
about them. The principle involves informing the participants about:
- All the pertinent details, including risks and benefits, about the study they will be involved in;
- Their voluntary involvement in the study, which means they can leave the research at any
point in the course of the study; and,
- The assurance that any information obtained about them as participants will be used only by
the researcher and only in certain ways.

2.1.2. Nonmaleficence

Nonmaleficence is based on the principle that “it is wrong to intentionally inflict harm on another”
(Wimmer & Dominick, 2011, p. 67). Researchers must always be aware of any potential threats
or disturbance that their research can cause to individuals or communities. Lee (1993) identifies
three possible threats to informants/respondents that researchers should always look out for.
These are intrusive threat, threat of sanction/retaliation, and political threat.

2.1.3. Beneficence

Related to the principle of nonmaleficence, beneficence “stipulates a positive obligation to
remove existing harms and to confer benefits on others” (Wimmer & Dominick, 2011, p. 67). This
means that researchers should always have the welfare of the research participant as a primary
consideration in the research. This requires ethical statements that maximize the benefits of the
research to the participants, their community, or to society, while also minimizing the harm that
it brings to the individual. Research that deals with marginalized sectors and groups, as well as
studies that delve into sensitive topics, often encounter issues of beneficence.

2.1.4. Justice

The general approach to the justice principle is that “people who are equal in relevant respects
should be treated equally” (Wimmer & Dominick, 2011, p. 67). The ethical principle of justice
emanates from the researcher’s respect for the participants. No researcher should take
advantage of any person or group, just to achieve the objectives of the study.



2.2. Ethical issues in communication and media research

Research requires the use of existing facts, gathered by other researchers, and the expertise of
other people. Both need to be acknowledged and valued. Anyone who wants to contribute to
any academic field should know that the production of knowledge and contribution to a field of
specialization is no easy task, so it is expected that researchers follow certain protocols and
procedures in using existing knowledge. A researcher is expected to be sensitive in dealing with
issues that concern ethical use of works in the research profession.

Communication and mass media researchers should think of two key considerations before
embarking on a research journey to ensure that they do their study ethically. First, they must
know how to use existing works that would help in the conceptualization and conduct of research.
Second, they must understand and prepare to implement existing ethical guidelines in the design
of the study and in the collection, analysis, and interpretation of data.

2.2.1. Ethical considerations in the use of existing materials

Researchers like you are encouraged to gather your own data from participants you have
selected. But some studies make use of data that others have already collected, through surveys
and other research activities. You can use these existing data to generate new hypotheses, or to
come up with new analyses. This enables you to save on time, money, and other resources which
you can allocate for other aspects of your research. However, the use of data that other
researchers have produced also raises some ethical questions that you have to think about.
Some of these revolve around potential harm to individual subjects, and the issue of consent.

For details about these ethical considerations, read up on intellectual property, plagiarism, and
piracy.

2.2.2. Ethical considerations in the collection of data

When collecting data, be mindful of the following:

a. Privacy and confidentiality—Privacy is about the right of people to be protected. This
means that individuals who participate in studies must be treated as autonomous and
should be given full respect for their information. There should also be protection from
embarrassment, stress, and other social harm. Confidentiality, on the other hand, involves
rules or agreements that limit access to the data that a person shares with researchers.

b. Disclosure—Disclosure is necessary in establishing a mutual relationship between the
researcher and the research participants. This refers to the nature and amount of the
information that a researcher is willing to divulge to the respondents of the study.

c. Conflicts of interest—Conflicts of interest in research occur “when the researchers’
coexisting personal, financial, political, and academic interests and the potential exists for
one interest to be favored over another that has equal or even greater legitimacy in a way
that might make other reasonable people feel misled or deceived” (Israel & Hay, 2008, p.
112).



3. STUDY FRAMEWORK
by Professor Fernando dlC Paragas, PhD

Developing a study framework is a multi-step process which begins with identifying the
appropriate theories and operationalizing these to apply to your project.

3.1. Selecting theories

Identifying the theory which best fits a study may sometimes be akin to looking for the proverbial
needle in a haystack. It is thus important for communication and media researchers to have a
good understanding of theories in our discipline in general as well as the specific theories which
we frequently use in our own projects. Here are some tips in understanding theories:
- Know the theorist behind the theory
- Know the theory’s original setting and its evolution
- Classify theories

The question remains, however: How does one choose a theory to inform one’s project? Here are
three diagnostic questions:
- Does the theory belong to the same paradigm as your research project?
- Are the premises of the theory aligned with those of your project?
- Does the theory most parsimoniously capture the arguments of your project? By parsimony
we mean the theory is simple yet comprehensive.

3.2. Operationalizing

Once you have identified the appropriate theory, you can now start to operationalize it. What
does this mean?

Operationalization is the process through which a theory, or an integration of theories, is applied


to a current research project. It can be too abstract an activity sometimes. Thus, let us make a
simple quick exercise. Imagine we want to study the factors that lead towards online gaming.
After searching for theories that can guide us in our study, we find the hypothetical A-ABC
Theory as the best fit for our project.

(By hypothetical theory we mean we are just conjuring it for illustration purposes. In actuality, as
we will show later in this section, we really do need to find a proper and well-established theory
to guide our research. Theories, as explained earlier, have already been confirmed by previous
research. We expand the knowledge base of our discipline through our research by incrementally
building upon theories. If we perpetually use atheoretical (or non-theory based) models then we
are always starting from scratch. In other words, it is as if we are always reinventing the wheel,
as the cliché goes.)

Now let us return to our hypothetical theory which supposedly argues that Control (C) is
determined by Attributes (A1), Attitude (A2), and Behavioral Intent (B). Given our topic and
hypothetical theory, we can then begin the operationalization process, which has the following
general steps.



3.2.1. Theoretical level

At this level, we discuss the theory that best informs the research project. Using the theorists’
original explanation and related literature (i.e., subsequent research by other scholars), we argue
for the compatibility between the theory and the current project in terms of their key assertions,
concepts, and inter-relationships. In illustrating this level, we use the concepts as defined and
presented in the theory. For example:

Figure 1. The A-ABC Theory

3.2.2. Conceptual level

At this level, we apply the theoretical concepts to our study. We cite studies which use the
theory to argue that our application is logical and valid. You can see in the following figure that
attitude, behavioral intent, and control have now been applied to our study by limiting their
discussion to spending time on online gaming. Attributes, meanwhile, have been translated
into demographic characteristics.

Figure 2. Conceptual Model

3.2.3. Operational level

At this level, we specify how we are going to study the constructs. The idea of spending time in online gaming has now been expressed in measurable terms: the number of hours. Moreover, demographic characteristics have been limited to three measures: Sex (male or female), Age (in years), and Personal Monthly Income.

Figure 3. Operational Model
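
To make the move across the three levels concrete, here is a minimal sketch, in Python, of how the operational level of our hypothetical A-ABC example might be recorded as variables and measures. All variable names and response options below are illustrative assumptions, not part of any established instrument.

# Minimal sketch: the operational level of the hypothetical A-ABC example
# expressed as concrete variables and measures. Everything here is an
# illustrative assumption for teaching purposes only.

operational_model = {
    "attributes": {
        "sex": ["Male", "Female"],              # categorical measure
        "age": "in completed years",            # numeric measure
        "personal_monthly_income": "in pesos",  # numeric measure
    },
    "attitude": "agreement with statements about online gaming (e.g., 5-point items)",
    "behavioral_intent": "intention to play online games in the coming week (e.g., 5-point items)",
    "control": "number of hours spent on online gaming per week",
}

for variable, measure in operational_model.items():
    print(variable, "->", measure)

Laying out the measures this explicitly also makes it easier to check, later on, that the research instrument asks for exactly these variables and nothing else.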



3.3. Connecting theory

The Framework is the bridge that connects different sections of a study. Specifically, the
framework links to

a. The Research Objectives—The framework and the objectives must align with each other.
Specific boxes and arrows in the framework must have a corresponding objective. There
should be no boxes and arrows in the framework that are not in the objectives. Conversely,
there should be no objectives which have no corresponding concepts and their relationships
in the framework.

b. The Review of Related Literature—The choice of the theory for the Study Framework is also
guided by related literature. Thus, when reviewing previous research, do take note of what
theories have been used in studying the topic of the current project. This helps ensure a
good fit between the theory and the topic.

c. The Methodology—The paradigm (positivist or interpretivist) of the theory selected must align with the methodology (quantitative or qualitative).

d. Data analysis and interpretation—Findings must be discussed relative to the arguments of the chosen theory and its operationalization. In line with the deductive, theory-driven approach of Positivist studies, data must be discussed according to the hypotheses in the analytical framework. Conversely, in line with the inductive, data-grounded approach of the Interpretivist paradigm, our goal in qualitative research is to construct a model based on our findings.

3.4. Doing the framework

As you do the framework, be mindful of the following guideposts in the operationalization process:
- Identify rigorously the theory which best informs the study.
- Respect the theory by explaining not only its main concepts but also nuances in its argument.
- Cite properly the theorists who originally developed and subsequently refined the theory.
- Use related literature in identifying the theory and explaining its use and operationalization.
- Ensure that each level is sufficiently more detailed than the previous.
- Support the hypotheses in the analytical framework with related literature.
- Check for coherence between the operational level of the framework and the eventual
variables and measures/concepts and indicators section of the research design.
- Link the data analysis and interpretation to the study framework.

3.5. Writing it

The operative word in writing the Study Framework is “explain.” The text must not simply
describe the theory but argue for its applicability to the project. This involves discussing the
original intent and use of the theory. Moreover, the text does not simply enumerate the changes
across levels. It must explain the logic and validity of such changes given the original propositions
of the theory and the objectives of the current research.



4. CHOOSING APPROPRIATE OPERATIONAL DEFINITIONS
by Assistant Professor Jon Benedik A. Bunquin, MA

After choosing the conceptual definition, your next task, or rather your next challenge, is operationalizing these concepts. This means moving from the abstract to the concrete or observable aspects of your research.

Operationalizing concepts entails identifying the dimensions of each concept and specifying indicators and corresponding measures per dimension. In qualitative research, operationalization stops at the indicator level because the phenomena being examined are not quantifiable. But in quantitative research, operationalization entails specifying measures of the indicators, which means identifying question types. In some cases, research scales are used to investigate the phenomenon numerically.

4.1. Dimensions and indicators

The first step in operationalization is identifying dimensions of a concept. Dimensions refer to the
various facets of a concept, i.e., the classifications of various meanings given to a concept. These
are based on the conceptual definition emanating from the literature. For example, the concept
of well-being could be defined in terms of its a) physical dimension referring to individuals’
overall level of health, b) social dimension referring to the social support they receive, and c)
emotional dimension referring to their general sentiment and outlook towards their current
status in life.

Dimensions are further specified in terms of their concrete or observable manifestations. These
are called indicators. Much like how a concept is composed of multiple dimensions, a dimension
is further broken down into multiple indicators. Using the concept of well-being as an example,
the physical dimension could be indicated by a) individuals' history of illnesses, b) their body-mass index, c) their activity level, and other indicators which signify that they are healthy and free of sicknesses.
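
To make the concept-dimension-indicator hierarchy concrete, the following minimal Python sketch expresses the well-being example above as a nested structure. The groupings are illustrative assumptions, not a validated operationalization.

# Minimal sketch of the concept -> dimensions -> indicators hierarchy,
# using the well-being example above. Groupings are illustrative only.

well_being = {
    "physical": ["history of illnesses", "body-mass index", "activity level"],
    "social": ["social support received from family and friends"],
    "emotional": ["general sentiment", "outlook towards current status in life"],
}

for dimension, indicators in well_being.items():
    print(dimension, "->", ", ".join(indicators))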

This makes identifying concepts, dimensions, and indicators a bit tricky. Using learnings from the literature, being guided by relevant theories, and setting the scope of your research can help you specify how you treat the concepts in your study.

4.2. Characteristics of good operational definitions

Indicators take off from the operational definition of constructs/variables specified in the study.
This definition serves as the basis for forming instruments, and a good operational definition
contains the following characteristics:

- They must be stated empirically (i.e., observable). For quantitative research, they have to
be measurable. Good operational definitions are not abstract. They are concrete
manifestations of the constructs included in the research.

- Good operational definitions are replicable. This is also a function of the concreteness or
observability of concepts. Constructs defined operationally can be easily spotted and
identified by other researchers. This means researchers must endeavor to come up with
good indicators for highly theoretical constructs.



- Good operational definitions are based on literature. Operational definitions must be
agreed upon by other scholars. This point highlights the importance of a rigorous review of
related literature prior to operationalization.

4.3. Scales

As noted earlier, qualitative operationalization stops at the indicator level, while quantitative
operationalization requires specifying measurement tools. One of these tools is the scale.

In the natural sciences, scales usually refer to the equipment used to measure an object or an
event’s magnitude, such as weighing scales or a ruler. But because we usually explore abstract
constructs in the social sciences, we use questions or several indicators to examine phenomena.

In some instances, a single question is enough to identify a construct or a variable, such as a person's marital status or a person's sex assigned at birth. But abstract constructs, such as beauty, happiness, and political engagement, cannot be identified through a single question because they are multidimensional and complex. Scales are best used to unravel underlying attitudes or abstract notions (Borgatti, 1996). They are constructed from a SINGLE DIMENSION (unidimensional) of a concept.

So how exactly do we develop scales? DeVellis (2003) recommends the following steps in scale construction:
- Clearly determine the variable intended to be measured.
- Generate a pool of statements (an item pool) which can go into the scale.
- Choose the appropriate format for the scale measurement (e.g., Guttman scale, Likert scale, or semantic differential scale).
- Ask subject matter experts to evaluate your item pool.
- Consider adding items that measure response bias.
- Pre-test the scale.
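
As a complement to these steps, here is a minimal sketch, in Python, of how responses to a hypothetical unidimensional five-point Likert scale might be scored once the items are final. The item names, the reverse-scored item, and the responses are invented purely for illustration.

# Minimal sketch: scoring one respondent's answers to a hypothetical
# unidimensional 5-point Likert scale (1 = strongly disagree ... 5 = strongly agree).
# Item names and responses are made up for illustration.

responses = {
    "item1": 4,
    "item2": 5,
    "item3_reversed": 2,   # negatively worded item; must be reverse-scored
    "item4": 4,
}

def score(answers, reverse_items, points=5):
    """Sum the items after reverse-scoring the negatively worded ones."""
    total = 0
    for item, value in answers.items():
        if item in reverse_items:
            value = (points + 1) - value   # on a 5-point scale, 2 becomes 4
        total += value
    return total

print(score(responses, reverse_items={"item3_reversed"}))   # 17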

4.4. Validity and reliability in research

Validity and reliability are twin concepts that researchers work hard to achieve. In this section, we discuss these two concepts and the ways through which we can achieve them in both quantitative and qualitative research studies.

4.4.1. Validity

In quantitative research, validity refers to the extent to which a measure accurately describes a construct. Measures have to be isomorphic with the construct, that is, as close to reality as possible. In simple terms, validity means that we are measuring what we say we are measuring.

Validity in quantitative research is tested in four levels:

a. Face validity, which refers to the most obvious and commonsensical way of measuring or representing variables in a study

b. Content validity, which refers to the comprehensiveness of a measure in covering all aspects
and range of meanings of a concept



c. Criterion validity, which means that a construct is tested against an external criterion, which
could be another variable it is associated with (concurrent validity), or a variable it predicts
(predictive validity)

d. Construct validity, which refers to the theoretical relationship of the variable with other
variables

In qualitative research, validity refers to the "appropriateness" of the study. This includes the instrument and analytical tools used, the data construction processes undertaken, and the data produced in the research. All of these are considered relative to the study's context. To achieve validity in qualitative research, two criteria are examined: credibility and transferability:

a. Credibility refers to the believability and trustworthiness of the research. In qualitative research, richness and thickness of insights is of utmost priority, rather than the amount of data collected. Participants decide whether the findings are reflective of the phenomena examined.

b. Transferability refers to the applicability of the findings of the research in other contexts.
Qualitative researchers, then, must ensure that they provide enough information that can
aid the readers in applying the findings to other contexts.

4.4.2. Reliability

Reliability in research refers to consistency and dependability in terms of measurement and findings. However, reliability for quantitative researchers differs from how qualitative researchers examine it.

We have three methods to ensure that the scales and measurement tools we develop are
reliable:

a. Test-retest method, which measures the stability of a scale, or its consistency when measured at different points in time. A scale is considered reliable when it yields the same responses after re-testing, assuming the absence of an intervention or stimulus.

b. Split-half method, which measures the internal consistency of a measure (a computational sketch follows this list)

c. Cross-test method, which measures equivalency of items. This is done by developing two
scales from one construct and examining the correlation between the two scales to establish
their equivalency.
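
To make the split-half method concrete, here is a minimal Python sketch that sums the odd- and even-numbered items of a hypothetical six-item scale, correlates the two halves, and applies the Spearman-Brown correction. The responses are made up for illustration, and this is a sketch rather than a prescribed procedure.

# Minimal sketch of the split-half method. The data are invented; the
# statistics.correlation function requires Python 3.10 or later.
from statistics import correlation

# Each row is one respondent's answers to a six-item scale (1-5).
answers = [
    [4, 5, 4, 4, 5, 4],
    [2, 2, 3, 2, 2, 3],
    [5, 4, 5, 5, 4, 5],
    [3, 3, 2, 3, 3, 2],
    [4, 4, 4, 5, 4, 4],
]

odd_half = [sum(row[0::2]) for row in answers]    # items 1, 3, 5
even_half = [sum(row[1::2]) for row in answers]   # items 2, 4, 6

r = correlation(odd_half, even_half)              # correlation between the two halves
split_half_reliability = (2 * r) / (1 + r)        # Spearman-Brown correction

print(round(split_half_reliability, 2))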

Unlike in quantitative research, reliability in qualitative research is not established statistically. Instead, we use the concept of dependability as an alternative to reliability; it likewise relates to the consistency and replicability of the research from data construction, analysis, and interpretation to reporting. Dependability recognizes the uniqueness of context and accounts for contextual factors in the research process to ensure that researchers who intend to replicate the study will be able to do so.



4.5. Research instrument

The concepts we have discussed so far culminate in the research instrument, or the tool used by
researchers to investigate a communication phenomenon. Here are two key questions to
answer.

What are the key considerations in developing a research instrument?


- The research objectives
- The respondents
- The articulation of the questions
- The length of the instrument

What are the steps in developing the instrument?


- Determine the information to be sought
- Decide what type of research instrument is most appropriate given the study's problem and objectives
- Decide the items that would be used
- Develop question wording
- Lay out and order the questions
- Pre-test the instrument, including the protocols of data collection
- Edit the research instrument and finalize procedures for use.

Instruments differ for each method. The next section discusses this in detail.



PART 2
Designing Research in
Communication and Media



5. RESEARCH METHODOLOGY
by Professor Fernando dlC Paragas, PhD

What is the Research Methodology? Literally, it means the study of methods. It may seem like it
is just a grand word for data-gathering, but it is more than that. Our methodology guides and
explains why and how we collect or construct the data to address our research problem.
Moreover, our methodology is the practical translation of our study framework.

5.1. Developing the research methodology

In developing and writing the Research Methodology, we do not simply describe how data are to be collected or constructed. Instead, we strive to be explanatory and analytical. This means we
- Use the related literature to underscore our decisions and choices in each section of the methodology, and
- Align our study framework with a) our variables and measures or concepts and indicators and b) our research instruments

Moreover, it is helpful to ask the following questions repeatedly as you develop your
methodology:
- What does the literature say about your proposed methodology? How has the methodology
been previously used in studies like yours?
- How different is your topic from that of a previous study that uses the methodology you
now want to employ in your own work?
- How closely aligned are the variables/concepts in your framework to those in your
methodology?
- What scales have been used in previous studies that may be applicable to your own research?

5.2. Differentiating methodology from methods and data-gathering and data-construction techniques

Let us turn our attention to the research methods which we use to gather data. We can
categorize these methods according to the paradigm they abide by and the topics which they
study. Moreover, we classify methods as quantitative or qualitative according to the data they
produce.

5.2.1. Methods

As Figure 1 indicates, we create four quadrants when we intersect paradigms with our two
general categories of topics. Studies which subscribe to the Positivist paradigm typically employ
quantitative methods whereas those which abide by the Interpretivist paradigm usually use
qualitative methods. Some methods are clearly categorized within each quadrant. However, do
note that the lines between these quadrants are dashed instead of being solid. This indicates that
the lines are porous since some methods and their data collection techniques can be located
within, between, or across Positivism and Interpretivism.



Figure 1. Research methods

Let us now look at Figure 1 closely and focus on the top row which covers our two approaches to
the study of messages. From conversations to speeches, from group exchanges to organizational
communications, and from content in the mass media to content on the internet and social
media, messages are what we send and receive, disseminate and exchange, and encode or
decode. If we approach messages in the Positivist paradigm, then we conduct quantitative
Content Analysis. If we approach these in the Interpretivist paradigm, then we do qualitative
Textual Analysis. Do note there are many other types of message analyses, but we are only
covering content and textual analysis approaches in this primer as these are more appropriate
for new researchers like you. Other methods such as discourse analysis, rhetorical analysis, or
linguistic analysis require a certain level of practice and maturity among researchers.

Now, let us concentrate on the lower row of the figure which refers to the individuals, groups, or
organizations that produce, distribute, receive, process, or exchange messages. If these sources
and receivers are our topic, then we can study them using any of the three methods covered in
this manuscript. Within the Positivist paradigm, we have Survey and Experiment. In the
Interpretivist paradigm, we have Ethnography.

We have one method, Reception Analysis, the approach of which can be Positivist or
Interpretivist depending upon its study framework and the application of data collection
techniques. Reception analysis can also be mixed-paradigmatic if it uses qualitative and
quantitative approaches to the study of how audiences receive or make sense of messages. You
may note there is a line between Reception Analysis and Experiment. Some studies on audience
reception use Experiment procedures by showing research participants a stimulus and
subsequently getting their insights about it.

Finally, we have Case Study which, by its nature, is mixed-paradigmatic as it employs quantitative
and qualitative data. Researchers either collect these data primarily through their own field work
or draw and analyze them secondarily from existing databases. There are many definitions of
cases, but we are choosing one specific approach in this Primer, as the corresponding section
explains.



5.2.2. Data-gathering or data-construction techniques

Now that we are clear with paradigms and methods, let us look at the actual techniques we use to collect data. As the oval shapes in Figure 2 show, we have five main data collection techniques in communication and media research. You will also see in Figure 2 that these techniques are located along our Positivist-Interpretivist spectrum since they can be used in either quantitative or qualitative research methods. It is in their application—whether theory-driven, data-grounded, or a combination of both—that we classify them as being quantitative or qualitative.

When deployed in Positivist-Quantitative Studies, interviews and coding are labelled as data-
gathering techniques because of the objective conceptualization of the reality from which data
emanate. Conversely, when deployed in Interpretivist-Qualitative Studies, interviews and coding
are referred to as data construction techniques because of the subjective process of inductively
building the study’s dataset.

Other resources and researchers categorize some items listed here (e.g., interviews, FGDs, and
observation) as methods. However, we argue that methods have clear paradigmatic foundations
and protocols as discussed earlier. Moreover, methods employ procedures, which, in this Primer,
refer to data collection techniques.

Figure 2. Techniques for gathering or constructing data



5.3. Writing the sections of the methodology

The methodology comprises not just the methods but also the philosophy behind the conduct of the
research itself. Thus, it typically contains the following sections:

a. Research design—This specifies a) the study's paradigm (whether positivist, interpretivist, or multi-paradigmatic), b) its goals (whether exploratory, descriptive, or explanatory), and c) its temporal dimension (whether one-shot or longitudinal)

b. Methods—This explains which quantitative or qualitative approaches are being used with
their corresponding data-gathering techniques

c. Units of analysis—This refers to the basic elements from which the data from a study
emanate. In studies about communication sources and receivers, individuals are typically the
units of analysis. However, groups or organizations can also serve as units of analysis.

In research about messages, data come from recorded or archived material. However, the
units of analysis will differ from study to study. One study may look at all news articles, while
another may focus on only headline stories. One study may look at only text, while another
may look at images only.

d. Sampling—This is the process of selecting elements (or units of analysis) from a defined
population.

For sampling in Positivist studies, the central goal is randomization, which means every item in the population has an equal chance of being included in the sample. Randomization abides by and ensures objectivity. Randomization, together with the corresponding sample size, determines the confidence level for the findings of a given study. By confidence level we mean the extent to which we will arrive at the same findings, within a particular margin of error, if we repeat a study many times. (A minimal sampling sketch follows at the end of this item.)

For Interpretivist studies, meanwhile, the guiding principle is purposefulness. It means we select informants who demonstrate expertise, provide rich insights, share compelling stories, and exemplify pivotal characteristics as regards our topic. After all, the goal of qualitative research is to provide depth and nuance.
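
As a concrete illustration of randomization, here is a minimal Python sketch of a simple random sample drawn without replacement from a hypothetical sampling frame of 500 respondents. The frame, the fixed seed, and the sample size are illustrative assumptions.

# Minimal sketch of simple random sampling: every element of the sampling
# frame has an equal chance of selection. Frame and sample size are hypothetical.
import random

sampling_frame = [f"respondent_{i:03d}" for i in range(1, 501)]  # hypothetical frame of 500
random.seed(2021)                                # fixed seed so the draw can be documented
sample = random.sample(sampling_frame, k=50)     # draw 50 without replacement

print(sample[:5])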

e. Variables and measures/Concepts and indicators—These are based on the study framework.
Variables and measures refer to the conceptual and operational levels in the study
framework of a Positivist study whereas concepts and indicators employ the theoretical and
conceptual levels of an Interpretivist study.

Research instruments explain the choice, structure, and content of the tools which we use to
gather data for the study.

For quantitative methods such as surveys and content analyses, these instruments include
the structured questionnaire and the corresponding codebook. Data gathering techniques
such as focus groups and focus interviews, when informed by the Interpretivist paradigm,
require semi-structured guides.



In discussing the instrument, we explain the rationale of each section according to the
objectives and the supporting literature. It is important that we cite the source material for
any scale that we use in the study. We should also explain if we have revised the scale to
tailor it to our own study.

We also report in this section if we did any pretests of our instruments before we
implemented them. Specifically, we compare the original and revised instruments to
demonstrate how we used pre-testing insights to improve our instruments.

f. Data gathering activities and procedures—This section discusses the specific protocols on
how data are to be collected or constructed. In this section we also include our budget and
timeline. For the budget, we must cite our funding sources, responsibilities, and
accountabilities for transparency purposes. Meanwhile, for the timeline, we may use a Gantt
chart to depict our different activities.

g. Data analysis—This explains the procedures for processing, analyzing, and interpreting data.

For quantitative methods, we detail how the entries in accomplished instruments are reviewed for completeness, legibility, comprehensibility, consistency, uniformity, and inappropriate responses. We also explain how the entries are to be encoded using the codebook. Statistical tests to address the hypotheses in the analytical framework are also explained. This includes exploratory and confirmatory tests.

For qualitative methods, we discuss how data are to be transcribed and then organized into
matrices. We then explain the types and levels of repeated or iterative reading that we are
going to do to surface patterns from the data. We then underscore how we will theorize, or
build a model, from such patterns.



6. CONTENT ANALYSIS
by Professor Fernando dlC Paragas, PhD and Professor Elena E. Pernia, PhD

Under the umbrella term of message analysis, we can explore the breadth and depth of what
gets printed in newspapers and magazines, broadcast on radio and television, shown in movies
and advertisements, or posted and shared on social media, among many others. Within message
analysis, there are two main approaches: content analysis and textual analysis. Content analysis
subscribes to the positivist paradigm and follows a quantitative approach to messages. Textual
analysis, meanwhile, subscribes to the interpretivist paradigm and follows a qualitative approach
to messages.

Content analysis is a powerful method to capture the messages embedded in recorded communication materials. Through its theory-driven quantitative approach, it helps us examine otherwise voluminous content using a manageable sample which is derived using such protocols as the probability-based constructed week. Probability sampling helps us generalize our findings based on our sample to the bigger content discourse: data from two constructed weeks, for example, can inform us about a whole year's content.

The theory-driven approach specifies the variables and measures which we use to code
seemingly continuous messages. We translate these variables and measures into a two-part
instrument comprised of the content analysis form and code guide. With iterative training that
ensures intercoder reliability, we are assured that we are coding as another person would code
the same message. We are therefore abiding by the ideals of positivist research for objectivity
and replicability.

6.1. Basic premises

• Objective

Content analysis subscribes to the positivist paradigm, which means that the characteristics
of messages in communication materials can be studied in a systematic and objective
manner. Hence, in line with positivism, it can be reasonably accepted that the results of the
content analysis of the same TV news program by different analysts or coders will result in
similar findings. That is because the coders use the same content analysis instrument and
have undergone some prior training on definitions and measures of the content analysis
variables. It is the coders who record observations about the messages embedded in the
communication material into the content analysis form.

• Theory-driven

The conduct of content analysis is deductive in nature. A study framework, constructed from
a theory or a set of theories, contains the nature and definition of the variables and
measures in the study.

• Quantitative

Content analysis focuses on the manifest elements of a message. Quantitative in its approach, content analysis determines the frequency or extent to which these manifest elements repeat or recur in a specific timeframe. Critics of content analysis say this makes the study of communication materials shallow or simplistic, but this also allows the coding of voluminous materials, including those which span a significant time period.



• Replicable

As the conduct of content analysis is informed by the principles of the classical scientific
process, it can be replicated with the expectation that the findings will be similar within a
certain margin of error. Replicability means the procedures are very detailed such that
another researcher can undertake the same project again and yield approximately the same
results. The subsequent research then helps confirm the findings of the earlier study.

• Generalizable

Content analysis abides by probability principles and procedures. Accordingly, findings from
a content analysis conducted on a sample of messages are generalizable to the universe of
messages from which the sample was drawn.

6.2. Basic components of content analysis

Three items specific to content analysis differentiate it from other message analysis methods:

• Content

Content, in content analysis, refers to messages as conceptualized in the positivist paradigm.


This means content has an objective definition and it can be characterized in terms of
variables and measures that are specified in the study framework.

This content is embedded in communication materials in print, audio, audio-visual, and electronic formats. It also pertains to communication material used in various communication levels (i.e., interpersonal, organizational, mass communication) and platforms (i.e., folk media, electronic media, interactive media).

• Constructed week

This sampling scheme has been devised by communication scholars because most content
analyses look at materials with a certain periodicity.

The regularity in narrative pattern within and across episodes of periodical content, such as news programs, allows for the generalizability of findings between a content analysis sample chosen through a probability constructed week scheme and a census of all episodes. In the constructed week sampling
scheme, the researcher randomly chooses one Monday, one Tuesday, one Wednesday, one
Thursday, one Friday, one Saturday, and one Sunday to comprise one week. This collection
of randomly drawn days becomes our constructed week. According to research, two
constructed weeks are sufficient to make inferences for a year of periodical or regular
content as discussed in the earlier paragraph. Research (e.g., Riffe, Aust, and Lacy 1993)
shows that findings from two constructed weeks have better generalizability than data from
two chronological or calendar weeks (meaning from Monday to Sunday).
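
The following minimal Python sketch shows how one constructed week might be drawn from a year of daily content; drawing two constructed weeks simply repeats the procedure. The year and the fixed seed are illustrative assumptions.

# Minimal sketch: drawing one constructed week from a year of daily content.
import random
from datetime import date, timedelta

year = 2020
all_days = [date(year, 1, 1) + timedelta(days=i) for i in range(366)]  # 2020 is a leap year

random.seed(7)   # fixed seed so the draw can be documented and replicated
constructed_week = [
    random.choice([d for d in all_days if d.weekday() == weekday])
    for weekday in range(7)   # 0 = Monday ... 6 = Sunday
]

for d in sorted(constructed_week):
    print(d.strftime("%A"), d.isoformat())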

• Intercoder reliability

This refers to the consistency with which different coders/content analysts would
independently but similarly code the same item.



6.3. Procedures for doing content analysis

How do we undertake content analysis? Here are the basic steps:

• Start with the basic conceptualization process

As with other quantitative methods, doing a content analysis begins with the basic steps of
introducing the study and its significance, the statement of problem and objectives, the
review of the related literature, and the construction of the study framework.

• Identify the units of analysis

Units of analysis are the elements from which you draw your data, based on your research question. In content analysis, the unit of analysis can be the whole material or a specific item in that material.

• Identify variables and measures


In your study framework, you operationalize your concepts into variables and measures (see
Part 1 for details). Variables are attributes of your units of analysis whereas measures are
the parameters of these attributes.

• Develop, pretest, and revise the instrument

Coding is the process of recording our observations about the communication materials. It is
the process through which pre-specified measures are entered for each variable. The
content analysis form and the code guide contain the variables and measures of the study.
The content analysis form is where we enter the data whereas the code guide tells us how to
do this.
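
As a concrete illustration, here is a minimal Python sketch of how a code guide and a single entry in a content analysis form might be represented and checked for valid codes. The variables and codes are invented for illustration only.

# Minimal sketch: a code guide as pre-specified measures per variable, and one
# row of a content analysis form checked against it. Variables and codes are hypothetical.

code_guide = {
    "story_topic": {1: "Politics", 2: "Economy", 3: "Entertainment", 9: "Others"},
    "story_tone": {1: "Positive", 2: "Neutral", 3: "Negative"},
}

coded_article = {"article_id": "N001", "story_topic": 1, "story_tone": 2}

for variable, code in coded_article.items():
    if variable in code_guide:
        assert code in code_guide[variable], f"invalid code for {variable}"

print("Entry is consistent with the code guide.")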

• Select a representative sample

As we have discussed earlier, the primary method for sampling in content analysis is
constructed week, which involves randomly drawing dates for each day of the week to
develop a non-chronological, non-calendar week. Two constructed weeks, according to
research, are enough to generalize for a whole year’s coverage for repetitive content in
periodicized formats such as newspapers.

• Determine the database

Once you have defined your units of analysis, the next step is to create the database from
which you will get your units of analysis. This is a make-or-break part of your research since
content analysis requires a good database for it to work. An incomplete database does not
permit a proper probability sample. As a result, your findings will not be generalizable,
thereby defeating one major purpose of doing a quantitative project such as a content
analysis.

• Test for intercoder reliability

To code a material the same way as others would code it. That is the maxim behind
intercoder reliability, as we have discussed earlier. It ensures the objective reading and
coding of the message at hand. There are online resources and calculators that can guide
you in the conduct of intercoder reliability testing and computing reliability coefficients.



The general idea is that you have to train your coders in coding your content using your code guide and coding instrument. After the training, you and your coders then separate from each other and independently code the same set of sample artefacts. You then calculate your intercoder reliability score (perhaps using an online calculator designed for this specific task). If your score is 0.70 or higher, it means you and your coders can now code your respective sets of material, with the premise that you are generally coding these "objectively." If your score does not meet this threshold, it means you have to undertake the training anew, code another set of materials, and test your scores again.
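
To make the computation concrete, here is a minimal Python sketch of simple percent agreement between two coders who independently coded the same ten sample artefacts. The codes are made up, and formal indices such as Cohen's kappa or Krippendorff's alpha additionally correct for chance agreement.

# Minimal sketch: percent agreement between two coders on the same artefacts.
# The codes are invented for illustration.

coder_a = [1, 2, 2, 3, 1, 1, 2, 3, 3, 1]
coder_b = [1, 2, 1, 3, 1, 1, 2, 3, 2, 1]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)

print(round(percent_agreement, 2))   # 0.80, which clears the 0.70 rule of thumb above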

• Implement your project

Once your coding team has achieved the ideal intercoder reliability score, you can then code the items assigned to each of you separately. When all members have accomplished their content analysis forms, it is time to encode these into a database. Content analysis follows the protocols for encoding, analysis, and interpretation for quantitative research.



7. TEXTUAL ANALYSIS
by Associate Professor Julienne Thesa Y. Baldo-Cubelo, PhD

Textual analysis is a research method used to interpret "texts" in order to find their possible meanings based on the interpretations that the researcher, or textual analyst, uses. Because it is a form of interpretation, textual analysis is about making educated guesses about the many possible ways a text may be interpreted. We can translate this act of doing textual analysis as a form of "sense-making."

Textual analysis is a research method fit for research questions on the meanings of “texts.” It is
the best method to use to surface informed estimations of a particular context’s set of values,
paradigms, motivations, and prospects for the future. Although it does not attempt to establish
causality among variables, it nonetheless allows for the surfacing of ideas and ideologies by
providing an account of the presence of certain abstract concepts in human phenomena.

Text, in the context of communication and media research, refers to the more tangible subjects of analysis. The most reliable form of text, however, is anything that is crafted by somebody (we call them "producers," "creators," or "makers" of content, who are either individuals, groups of people, institutions, or companies).

7.1. Basic premises

• Texts as artefacts of cultures

The text in textual analysis is called “text”—not just “films,” “TV programs,” or “books”—
because it is not just content made by its producers (writers, directors, authors, etc.).
Instead, the text is assumed to be a reflection of the culture, or the context, in which it is
embedded. Any single artefact, object, act, or phenomenon from a particular culture is
argued to reflect certain aspects of this culture.

• Cultures as both heterogenous and homogeneous units

“Culture,” therefore, is seen not just as a homogenous entity, but also as a heterogenous or
“polytheistic” unit composed of varying sensibilities with overlapping and sometimes
contradicting tendencies. “Polytheism” technically means the belief in many gods, but here,
we adapt it to mean to be these unique attributes of cultures—variety and diversity.

• Sensemaking as a central activity in textual analysis

Sensemaking is the act of arriving at meanings and literally means “making sense” of what is
presented to the senses—estimating, evaluating, feeling, tinkering, “getting to know,” and
familiarizing what may at first appear to be mysterious, meaningless, unimportant, or even
senseless. Sensemaking is forwarding the meaning that the analyst has subjectively arrived
at.

Sensemaking in textual analysis is also termed as “critiquing” due to its usual way of
considering something as being “problematic.” Therefore, in higher-level textual analyses,
the term “problematization” often emerges. Researchers critique, not just criticize, a text
when they go beyond the exposition of the good and the bad of it.



• Texts as evidence of cultures

"Texts" are evidence of the existence of "cultures." Photos of food on Instagram are evidence of the "culture of cuisines" in certain regions in the country, of an upcoming "culture of street food courts," or of a "culture of food photography."

• Context, context, context

In the sensemaking of texts, context reigns supreme. Context is the place or the "universe" in which something is lodged. The text has a context, and the reader of the text also has a context. Where the text is situated presents us with many other levels or layers of context. It is important to note that context may be considered a bit differently by another method, ethnography. In ethnography, "context" is taken to mean the comprehensive or complete background of a social group. On the other hand, but not in total contradiction, context in textual analysis can mean both the comprehensive background of a text and the more specific "universe" found under a larger universe where the text is lodged.

• Subjectivity as key in qualitative research

Sensemaking, not just in textual analysis but in most qualitative research, has subjectivity at its core. The focus of subjectivity is on the position of the knower (or the analyst,
researcher, sense-maker, or interpreter) as an important participant in knowledge
production. This means that where the knowers are coming from (or their specific contexts)
makes their subjectivity unique to them. Subjectivity values the unique position that analysts
hold in relation to how they generate meaning. One’s context is the starting place of this
subjectivity.

7.2. Procedures for doing textual analysis

How do we undertake textual analysis? Here are the basic steps:

• Start with a “focus” as warm-up for conceptualization

Start somewhere. Start from something you may be most familiar with at this point, or from something you spend a lot of time on.

• Formulate a tentative research question

Formulating a tentative research question can be like choosing a topic and narrowing the focus of this topic. This is still part of a continuing process of conceptualization, and everything may seem sketchy at this point. Please note that central to conceptualization is reading related literature, even if you are at first only second-guessing what counts as "related literature."

• Review related literature and study theories that may guide your study

This part of the research conceptualization goes hand in hand with formulating the research question and deepening the articulation of why a researcher's interest in a topic is worth pursuing.



• Select a “text” from an artefact

What does choosing a text mean in textual analysis? What is the array of choices for this research method? Selecting a text may first require two things: collecting several kinds of this text (if your "text" is the YouTube make-up video tutorial, for instance, you may pin several of these from either one YouTuber or several); or surveying the presence of make-up tutorials across platforms or applications (Facebook, Pinterest, YouTube, Instagram, etc.). This scanning of the field is informed by your personal experience and by your review of related literature. To select is to decide which one of what kind (is it the YouTube videos only, and how many of these?).

• Select a unit of analysis

Once the choice of artefact is clear, you can now choose what particular aspect of the artefact should be interpreted. Somehow this was already done in the previous step—say, the choice of a teleserye brought you to the decision that only scenes where a male and a female character are talking will be considered. It is very important, though, that this crucial act of choosing and deciding is connected to how the choice of unit of analysis answers the research question.

• Identify concepts and indicators

The identification of the study's concepts and indicators in qualitative research is the equivalent of the identification of variables and measures in quantitative research. Since concepts are abstract, they cannot be observed in tangible form in the world; indicators serve as their observable manifestations.

• Analyze the text

This is the most fun part. The requirement here is two-fold. First, do not make a priori assumptions about your text (meaning, do not pre-empt your data). Second, since you are also allowed to bring out your subjectivity, you are expected to be clear about your lenses (where are you "coming from," and to what paradigm do you adhere?). "Lens" is also called "stance." It can be compared to a pair of sunglasses through which we see the world: if the sunglasses had red lenses, the world would seem red in color. However, even if your lenses are disclosed to your potential reader and, more importantly, are clear to yourself, there should remain a level of informed innocence as you read your text. This is about letting the text speak to you regardless of your knowledge of it.

• Describe the text

Before we jump into interpretation, we are first required to describe the text. It is necessary
to note here that many research textbooks consider description as part of analysis because
the choice of words is never value-free. The description of “tone,” for instance, can have
different possibilities. Description, therefore, is unavoidable not only because it concretizes
data into something that is already held and handled by the analyst, but also because it is a
think-out-loud exercise that further familiarizes the researcher with the text. If one can
describe something, there is evidence that something does exist.



• Interpret the text

The interpretation of the rhetorical context asks the following questions:


a. Who is the writer/speaker/performer?
b. What is her or his role or position?
c. Who is the intended audience?
d. What is the exigence which prompted this writer to write?
e. What discipline or discourse community does this text seem to be a part of?

Likewise, there is the interpretation of textual features which asks the following:
a. What issue is being addressed?
b. What position does the writer take?
c. What is the author's major claim or thesis?
d. Is the claim qualified (does the author hedge)? If so, how?
e. What evidence or reasons does the author supply to support the claim?
f. How good are these reasons or evidence?
g. Why do you trust or distrust the claims and evidence?
h. Does the author offer any refutations? If so, of what?
i. How effective are the refutations? What makes them persuasive or unpersuasive?

The other major category of interpretation is extrinsic interpretation. It is a kind of interpretation which places the text in new contexts or relates it to other external phenomena. This is bringing the text out into the universe of values that the analyst would like us to see. In a way, the analyst convinces the potential readers that the text is worth examining. This is tantamount to saying, "I see it this way because this is the philosophy I believe in, this is the ideology I adhere to, these are the values I consider important, and this is the cultural norm I am highlighting."

• Review interpretations

This last step makes sure that analysts stay within the framework of the study and check
whether the research question is being answered. Since textual analysis connects the data
gathering method to analysis, researchers are expected to be extra careful with their
accounting for data. Are their interpretations substantiated by detailed descriptions? Are
their interpretations still within the bounds of their disclosed lenses or paradigms? Do the values in their individual interpretations forward the good of the people without trampling on particular individuals or groups of people? How does this text relate to other texts the
researcher has been reading? How might another writer or researcher use the
interpretations presented here?

This last step is often not reflected in actual research write-ups. What we see are the final
interpretations. You have to remember, therefore, that these final sets of interpretations
have gone through a lot of review. In many instances, students are asked to defend their
final thesis so that interpretations can be scrutinized and can be guided further.



8. SURVEYS
by Associate Professor Ma. Rosel S. San Pascual, PhD

The survey is one of the most popular social science research methods. It allows social science
researchers to ask an assortment of questions using a variety of formats for a wide range of
concepts and variables. The content and form of a survey may be customized to address a given
study’s problems and objectives, to match the level of literacy of the target population, and to
maximize response rate.

As a quantitative research method, surveys are geared towards objectivity, thereby asking a
standard battery of questions to a set of sampled respondents. Surveys inquire about the incidence of certain variables across the sampled respondents and count the number of times these incidences occur. Depending on the sampling design, survey results may be used
to describe the sample or to make inferences about the population that the sample purports to
represent.

Surveys enable social science researchers to efficiently gather a huge amount of data across a
defined sample. They also enable researchers to gather data over a single period of time (i.e.,
cross-sectional survey) or across time (i.e., longitudinal survey). With the advent of online
technology, surveys may also be conducted across geographic space.

8.1. Basic premises

• Application

Surveys gather information from individuals through their responses to a standard questionnaire. Surveys that involve asking all cases in the target population are referred to as a census. The amount of resources required for conducting a census depends on the breadth and spread of the target population. However, with wider access to online technology and greater digital literacy, administering a census online may not be as resource-heavy even when the target population is geographically spread out.

• Focused

Surveys are typically designed to address a study’s defined set of research problems and
objectives. However, not all surveys are precisely designed to target a defined list of
research problems and objectives. Omnibus surveys cover an assortment of questions that
inquire on a variety of topics that would potentially interest different researchers (Schutt,
2001). Moreover, a survey may be designed as a cross-sectional study, wherein time is held
constant and data gathering is conducted among a range of individuals within a single time-
period.

• Goals

Surveys are typically employed when the research goal is descriptive or evaluative.
Descriptive research uncovers “what is going on or what exists” (Pernia, 2004, p. 23) while
evaluative research assesses “whether an intervention has achieved its objectives, and what
combination of factors or variables is most effective in achieving desired outcomes” (Pernia,
2004, p. 24).



Survey results per se do not establish nomothetic causal explanation (an explanation that
changes in the independent variable are consequently followed by changes in the dependent
variable, while holding other variables constant), which is the goal of explanatory research.
However, such results may help strengthen the evidence for causality by providing
descriptions of statistical associations or correlations.

8.2. Procedures for doing surveys

Survey projects that entail primary data gathering involve a meticulous process of research
conceptualization, design, and implementation. As a researcher embarking on a survey project,
you must be adept at conceptualizing, designing, and implementing your communication or
media research.

• Conceptualize your survey project

Every research project starts with reading a broad range of materials, and your survey project conceptualization likewise takes off from reading a wide variety of references to facilitate your choice of a survey research topic. Once you finalize your choice of topic, you then read related references to help you articulate your survey research question, problems, and objectives as well as define the concepts that your survey research intends to cover.

• Choose your survey form and tool

Consider the strengths and weaknesses of various survey forms when choosing a particular form for your study. Depending on your chosen survey form, you may also choose the specific tools that you will employ in recording responses; each tool would have its own set of advantages and disadvantages.

• Develop your survey questionnaire.

Survey questionnaires mediate between you (the researcher) and your respondents. On the
one hand, you have a set of questions to ask that addresses your study’s problems and
objectives. On the other hand, your respondents have the answers to the set of questions
that you want answered. Survey questionnaires, therefore, serve two important functions:
- as a measuring device that converts the measures of the concepts that you want to
study into questions and response options, and
- as a communication device that articulates the measures of the concepts that you want
to study through a language and form that your respondents would clearly understand.

Survey questionnaires are different for interviewer-administered and self-administered surveys. In interviewer-administered survey questionnaires, an interviewer facilitates the communication between the researcher and the respondents. Meanwhile, respondents accomplish self-administered questionnaires on their own. As such, the self-administered questionnaires should be carefully crafted so that respondents can properly understand their content, which then helps ensure the valid accomplishment of the survey forms.

Survey participation is a favor that you are asking from your respondents. Hence, you should
make it a point to develop survey questionnaires that are respondent-sensitive in terms of
content, language, organization of items, layout, and length.



• Design your sampling scheme

The principle of external validity should guide sampling design and, in most cases, survey
projects must satisfy the requirements for external validity. External validity means that
survey results generated from the representative sample may be used to make inferences
about the target population.

When the characteristics of the sample are used to estimate the characteristics of the
population from which the sample was drawn, the sample must be representative of that
population in order to provide the best possible estimates of that population. As the term
suggests, a representative sample is a sample that represents the population, such that
results derived from a representative sample may be used to make inferences about the
population. A representative sample is an adequately sized and randomly drawn sample.

• Implement your survey

a. Face-to-face surveys—Survey interviewers play a critical role in interviewer-administered surveys. The success of your survey project depends on their adequate training and their proper execution of the survey. To facilitate the successful conduct of interviewer-administered surveys, it is essential for you to prepare the following materials, which are typically contained in a survey kit: interview protocol, interviewer greeting and departure script, informed consent form, survey questionnaire, standard tool for recording responses, optional incentive, and other collaterals such as an identification card and/or permit/endorsement to conduct the survey.

b. Online surveys—A well-articulated and formatted questionnaire is necessary when implementing self-administered surveys. As the self-administered questionnaire addresses your respondents directly, the survey introduction, instructions, questions, and response options must be self-explanatory. To facilitate the successful conduct of self-administered surveys, your self-administered survey questionnaires must contain the following sections: introduction, informed consent form, instructions, questions, response options, and space for responses.

• Encode, process, analyze, and interpret results

Data gathered from surveys are encoded, processed, analyzed, and interpreted. Data generated from a representative sample may be used to make inferences about the population from which the sample was derived. Otherwise, the data can only describe the existing pool of respondents.



9. ETHNOGRAPHY
by Ma. Aurora Lolita L. Lomibao, MA

Ethnography is the art and science of describing a group or culture. It is a valuable research
method because of its strengths: reducing ethnocentrism and helping to understand complex
societies and human behavior. Ethnographic researchers must be sensitive to the concepts of
culture, context, inter- and intra-cultural diversity, and the use of symbols and rituals; as well as
adopt an emic and etic perspective and a non-judgmental orientation during the research
process.

Ethnography utilizes multiple and flexible methods. Ethnographic research can produce rich data
and interpretation. However, researchers must anticipate the challenges of rigorous
ethnographic studies in communication and media.

9.1. Basic premises

• Ethnography as interpretivist research

Very simply, ethnography describes the contexts, processes, and meanings of a community,
in their everyday settings. While this sounds quite easy, it actually entails so much planning
and actual work because ethnography, like all other methods of conducting research, has to
be a strategic activity. Ethnography aims to understand and describe a social or cultural
group or situation from the insider’s perspective—from the viewpoint of those who belong
to that group or occupy that situation. We call this the “emic” perspective.

• The value of ethnography

In general, its value is in seeking to understand the cultural context of people’s behavior, and
the symbolic meaning and significance of that behavior within that context. The
ethnographic approach is also most useful when dealing with something new, different, or
unknown. It is an excellent way of gaining insights into a culture or a social process,
particularly a) those in complex behavioral settings, b) those involving other cultures and subcultures, and c) those of institutions and organizations.

• Ethnography and reflexivity

Reflexivity is an indispensable requirement for the student who wants to utilize ethnography
in communication and mass media research. This means that you must reflect on your own
role in the stories that you tell, examine the biases you may have regarding the topic or the
people you will collaborate with, and any physical or emotional ties you may have to your
subject of inquiry.



9.2. Basic concepts

• An in-depth focus on culture

Ethnographers need to know about both cultural behavior and cultural knowledge to
describe a culture or subculture adequately. Here we define culture operationally as the sum
of a group’s observable patterns of behavior, customs, and way of life. In communication
and media studies, culture can also refer to a set of professional, organizational or group
practices and behaviors within a given communication context. For instance, ethnography
can be employed to study the culture of fan groups who idolize certain media icons. It can
also be used to characterize the nature and practices of a media group or institution.

• The use of a holistic perspective and emphasis on context

Ethnography assumes a holistic outlook in research to gain a comprehensive and complete picture of a social group. Even though researchers are studying a particular aspect of a group or culture, they must still attempt to describe as much as possible about it. This description might include the group's history, religion, politics, economy, and environment.

This is the impact that context and a holistic approach provide to ethnographic studies—
they enable researchers to tell more engrossing stories and foster profound awareness and
appreciation of other groups and peoples.

• Cognizance of symbols and rituals

Ethnographers look for symbols that help them to understand and describe a culture.
Symbols are condensed expressions of meaning that evoke powerful feelings and thoughts.
For example, communication and media students can study the symbols—the internet
shortcuts, or emojis—that young people use to exchange messages with their friends, as
against those they use with their parents online. Do they use different symbols for each
group? What do these differences say about their relationships?

• An awareness of both emic and etic perspectives

The emic perspective, or the insider’s view, is at the heart of most ethnographic research. It
helps the researcher to understand why members of the social group do what they do. An
etic perspective, meanwhile, is an external social scientific perspective on reality. Good
ethnography can use both emic and etic perspectives.

Most ethnographers start collecting data from the emic perspective and then try to make
sense of what they have collected in terms of both the locals’ views and their own scientific
analysis. Just as thorough fieldwork requires an insightful and sensitive cultural
interpretation combined with rigorous data collection techniques, so too does good
ethnography require both emic and etic perspectives.



• The adoption of a nonjudgmental orientation

Sometimes we bring biases into our research, especially when we are studying controversial
or moral issues, or even matters and people we feel strongly about. This bias can color the
way we approach our subjects, the questions we ask, and the way we interpret our findings.
Thus, a non-judgmental orientation requires the ethnographer to suspend personal
valuation of any person, group, or cultural practice. The ultimate goal of ethnography is to
shed one’s own biases and comprehend another way of life from the point of view of people
with different meaning systems and visions of the world.

9.3. Procedures for doing ethnography

Just like in any research project, you need to have a clear plan before you actually conduct an
ethnographic study. The writer Paul ten Have (2004) proposes three phases or tasks for any
ethnographic project:

a. The researcher has to gain permission from academic bodies, such as the adviser and the
school, and also from the site of the proposed ethnographic research. This usually means
you need an approved research proposal, and consent from the communities and the people
you will study.

b. The researcher has to ensure that, once in the field, various data are sufficiently gathered
and recorded. At the same time, the researcher’s activities must not disrupt or disturb the
ordinary and day-to-day activities of the people too much.

c. After the field work is finished, the researcher has to guarantee that the findings are
rigorously analyzed and written in a manner that is acceptable academically as well as to the
community or culture that the researcher has studied. This means that first, the study must
academically have a “convincing contribution to social scientific knowledge,” while likewise
presenting “a picture of the field that does not damage the social image of described
persons and/or collectivities too much” (ten Have, 2004).

The main thing to remember when using ethnography is that it is not one single method, but a
holistic approach that employs a family of data collection techniques in documenting the culture
of a community or a group of people. Ethnographers have traditionally used various classical
ethnographic techniques such as:
- Carrying out field work and living in the communities of their hosts;
- Observing activities of interest;
- Recording fieldnotes and observations;
- Participating in activities during observations (also called participant observation); and,
- Carrying out various forms of ethnographic interviewing

Other techniques that researchers have traditionally used include the physical mapping of the
study setting, conducting household censuses and genealogies, assessing network ties, and using
photography and other audio/visual methods. In the case of digital ethnographies, researchers
are often in mediated contact with participants rather than directly present.

Note that because of the orientation towards understanding context and meaning from the
perspectives of their hosts, ethnographic researchers must be open to the use of all data
collection techniques for understanding the human condition, and not be limited by the
boundaries of labels such as quantitative versus qualitative.



9.4. Types of ethnography

While there are no fixed and universal typologies regarding ethnography, two general types can
be identified.

9.4.1. Classical ethnography

Classical or traditional ethnography originated from and is usually used in the fields of
anthropology and sociology. Classical ethnography relates to the accounts of social life studied
within the particular location or setting upon which the ethnography is focused. In very general
terms, classical ethnographers are concerned with everyday events, emphasize meanings and
behaviors, and ensure that they gather insiders’ or emic accounts of community experiences and
cultures.

9.4.2. Emerging types of ethnography

Over the last decade, anthropologists and other social scientists, including communication and
media scholars, have taken the basic concepts of ethnography and applied them in different
research settings. This is because changes in technologies, cultures, and behaviors have
necessitated new applications of existing methods.

Autoethnography, a combination of the terms ‘autobiography’ and ‘ethnography,’ is “when a
researcher describes or analyzes personal experiences to better understand a cultural event”
(Croucher & Cronn-Mills, 2015). An autoethnography is almost always a first-person account by a
researcher who writes about a personal experience and how it intertwines with the culture
being studied. Many autoethnographies are written as journals, short stories, poems, personal
essays, and other forms.

Ethnography can also be used in audience and reception research. La Pastina (2005) applied
ethnographic methods to audience studies and termed this media engagement, or the
“fieldwork-based, long-term practice of data collection and analysis” (p. 139).

Examples of emerging types are virtual ethnography, digital ethnography, and visual
ethnography. These types essentially apply the approaches of ethnography to online and
technologically mediated settings, studying people, communities, and cultures that are formed
through computer-mediated social interaction. These types of ethnography expand the
definition of culture to include cultures that are not defined by physical proximity, and the
definition of communication to include contexts beyond face-to-face or group interaction.

Paul ten Have (2004) described virtual ethnography as “the ethnographic study of online
activities as in newsgroups, chat rooms, etcetera.” Christine Hine, in her book Virtual
Ethnography (2000), stated that “Conducting an ethnographic enquiry through the use of CMC
(computer-mediated communication) opens up the possibility of gaining a reflexive
understanding of what it is to be part of the Internet... the ethnographer learns through using
the same media as the informants” (p. 10).



Digital ethnography, on the other hand, is research into the digital, material, and sensory
environment. Pink et al. (2016) noted that digital ethnography “takes as its starting point the idea
that digital media and technologies are part of the everyday and more spectacular worlds that
people inhabit” (p. 7) where “the digital ethnographer observes people, things and processes as
they engage in activity traversing the online/offline” (p. 152). Some studies that used digital
ethnography involved immersion in virtual worlds, gaming, navigating through online and offline
worlds, and camera phone studies, among others.

Visual ethnography is ethnographic research into visual technologies, images, metaphors, and
ways of seeing. Pink (2013) noted the constant presence of images in people’s everyday lives,
and how they are “part of how we experience, learn and know as well as how we communicate
and represent knowledge” (p. 1). Visual ethnographers frequently work with photography, video,
and web-based media.



10. EXPERIMENTS
by Professor Violeda A. Umali, PhD

The experiment is widely considered the best method (the gold standard) for analyzing
causality, or the cause-and-effect relationship between variables.

Experiments have a complex design. Their resource requirements are often higher than those for
most other research methods. There are many factors that could lessen the validity of
experiment results. But their enduring popularity not only in the natural and medical sciences
but also in the social sciences—including communication and media—is sufficient proof of their
significant contributions towards helping people gain a better understanding of themselves and
the world around them.

10.1. Basic concepts

• Definition

“Experiment” can be broadly defined as the method that establishes causality by
manipulating one variable (the stimulus, treatment, or intervention, i.e., the cause) and
observing its impact on another variable (the outcome or the effect). The variable identified
to be the cause is called the independent variable (e.g., messages about capital punishment)
while the variable identified to be the effect is the dependent variable (e.g., attitude towards
capital punishment). Thus, alternatively, we can define experiment as the research method
that studies the effect of an independent variable on a dependent variable.

• Manipulation in experiments

In the context of an experiment, “manipulation” suggests two things. First, the researcher
deliberately exposes experiment participants to a stimulus, treatment, or intervention.
Second, the researcher varies how the experiment participants are to be exposed to the
stimulus/treatment/intervention. Some participants will not be exposed to it at all; they
make up the experiment’s “control group.” Furthermore, those to be exposed might get
different versions of the stimulus/treatment/intervention—for example, some might be
exposed to messages in favor of capital punishment and others, to messages that are against
it. In other words, when we say “manipulate” or “control,” we simply mean that the
researcher consciously decides on when and how the stimulus, treatment, or intervention is
to be introduced to the experiment participants.



10.2. The classic experiment

10.2.1. The basic components

There are different types of experiments. Before we discuss them, it is important for you to be
familiar with the components of the classic experiment. By “classic” we mean the experiment as
originally designed, or what is referred to as the “true” experiment.

The classic experiment has six components, as follows:

a. Treatment—the independent or experimental variable; this is the variable that the
researcher manipulates or modifies and, as such, is the “cause” variable in the causal
relationship

b. Dependent variable—the trait or characteristic that is expected to change as a result of an
individual’s exposure to the treatment or independent variable; the variable that manifests
the “effect” in the causal relationship

c. Pretest—the measurement of the dependent variable before the treatment is introduced in
the experiment; the first observation (O1)

d. Posttest—the measurement of the dependent variable after the treatment has been
introduced in the experiment; the second observation (O2)

e. Experimental group—the group that receives the treatment

f. Control group—the group that does not receive the treatment
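
For orientation, these components are often summarized with a conventional shorthand used in
experimental design texts (the diagram below does not appear in this primer, but it is consistent
with the components above): R stands for random assignment, O1 and O2 for the pretest and
posttest observations, and X for the treatment.

    Experimental group:   R   O1   X   O2
    Control group:        R   O1        O2

Reading across each row, both groups are observed before and after, but only the experimental
group is exposed to the treatment between the two observations.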

10.2.2. Other requirements

Aside from the components discussed above, the classic experiment has other requirements that
researchers must fulfill, namely:

• Random assignment of participants

In true experiments, the participants (respondents) are assigned to a group (experimental or
control) by flipping a coin or by generating a set of random numbers. In other words, the
assignment to groups is not based on the preference of the researchers or the participants.
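
To make this concrete, here is a minimal Python sketch of random assignment. The participant
IDs and the use of Python’s standard random module are illustrative assumptions, not part of this
primer; any equivalent randomization procedure (coin flips, random number tables) would do.

    import random

    # Hypothetical list of recruited participants (illustrative IDs only)
    participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

    random.seed(2021)             # fixing the seed makes the assignment reproducible and documentable
    random.shuffle(participants)  # put the participants in a random order

    half = len(participants) // 2
    experimental_group = participants[:half]  # these participants will receive the treatment
    control_group = participants[half:]       # these participants will not receive the treatment

    print("Experimental group:", experimental_group)
    print("Control group:", control_group)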

• Equivalence of the experimental and control groups

True experiments require the experimental and control groups to be equivalent—i.e., they have
the same profile, or the members of one group have characteristics similar to those of the
members of the other group.

The equivalence of groups is easier to establish in laboratory than in field experiments. In a
laboratory setting, the researcher has some control over how to assign the participants to the
experimental and control groups. In a field setting, however, it is impossible to have two
areas (experiment sites) that are 100% similar.



10.2.3. Conditions for causality

Before researchers could claim the presence of causality in their experiment, they must show
proof that the following conditions have been met:

• Equivalence of groups in the pretest

As discussed earlier, the participants in the different experiment groups should have the
same profile. First, they should have similar socio-demographic profiles. For example, if the
experimental group is composed of young, well-educated, and city-based participants, the
control group should have the same composition. Second and more important, prior to the
administration of the stimulus/treatment/intervention to the experimental group, all groups
should have the same profile in relation to the dependent variable.

• Non-equivalence of groups in the posttest

While different groups should have the same attitude and behavior profile during the
pretest, they should have different attitude and behavior profiles after the
stimulus/treatment/intervention has been introduced to the members of the experimental
groups.

• Absence of treatment effects in the control group

The term “absence of treatment effects” simply means that there is no change in the
members of the control group with regard to their profile for the dependent variable.

• Presence of treatment effects in the experimental group

Since the experimental group participants are the ones who receive the
stimulus/treatment/intervention, it is expected that there will be a change in their profile
vis-à-vis the dependent variable.

10.3. Experimental designs

There are three main types of experiments: 1) true experiments, 2) pre-experiments, and 3)
quasi-experiments. Under each type, there are different kinds of studies that could be
conducted. The experiments differ from each other in terms of the way they are designed and,
consequently, the extent to which they can establish causality.

• True experiment

The true experiment conforms to the classic experiment discussed above.



• Pre-experiment

A pre-experiment lacks some of the crucial features of a true experiment. First, no control
group is used; when there are two or more groups to be tested, equivalence of groups
and random assignment of participants are not required. Additionally, the conduct of a
pretest is optional. For these reasons, pre-experimental studies are much easier to
implement than true experimental designs. However, they provide little evidence of
causality.

• Quasi-experiment

The quasi-experiment is like the true experiment in that it also uses a well-defined control
group, and the researcher implements protocols to ensure and confirm the equivalence of
the experimental and control groups. However, in quasi-experiments, random assignment is
not strictly followed, often because random assignment is not possible or is difficult to
achieve, which is usually the case with studies that involve people as participants. As such,
quasi-experiments are the most suitable for social science studies.

10.4. Procedures for doing experiments

10.4.1. Conceptualizing the experiment

• Determine if the experiment is appropriate for the study that you want to do

We have to start with the basic question—should you do an experiment? If you are
interested in determining the impact of an intervention (independent variable), which you
will introduce (manipulate), on people’s perceptions, knowledge, attitude and/or behavior
(dependent variable), then the experiment is your first choice for the research method to
use. You then have to assess the resource requirements—time, money, your
competencies—of doing the experiment and see if you can meet them. Determine which
resources you already have, which ones you do not have yet but can access or acquire, and
which ones might be difficult for you to obtain. Be realistic in your assessment, but don’t be
afraid to dream, either.

• Complete the basic conceptualization process

Like any other research project, doing an experiment begins with the basic steps of
articulating the focus of the study and its significance, the statement of problem and
objectives, the review of the related literature, and the construction of the study framework.

10.4.2. Planning your experiment

• Choose the type of experiment (pre-, true, or quasi-experiment) and specific type of study

As earlier discussed, among the three types of experiments, the quasi-experiment is most
often used in communication and media research, as well as in other social science fields,
because it has less stringent requirements than a true experiment but is more robust than a
pre-experiment. There are already many published studies that used the quasi-experiment;
you could look them up to guide you in your own research project.



• Choose your experiment setting (laboratory vs. field)

Study the comparative advantages and disadvantages of the two settings to guide you in
your decision.

• State the hypotheses to be tested

There are four basic hypotheses tested in experimental studies, as follows:


- H1: There is no significant difference in the pretest scores of the control and
experimental groups. (Equivalence of groups in the pretest)
- H2: There is a significant difference in the posttest scores of the control and
experimental groups. (Nonequivalence of groups in the posttest)
- H3: There is no significant difference between the pretest and posttest scores of the
control group. (Absence of treatment effect in the control group)
- H4: There is a significant difference between the pretest and posttest scores of the
experimental group. (Presence of treatment effect in the experimental group)

The final set of hypotheses will vary from one study to another. First, the formulation of the
hypotheses will depend on the theoretical framework of your study. Second, there will be
more hypotheses when there are more experimental groups and treatments.
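
As an illustration, here is a minimal Python sketch of how the four basic hypotheses above might
be tested with independent- and paired-samples t-tests. The variable names and the scores are
hypothetical assumptions for illustration only; the appropriate statistical test for an actual study
depends on its design, sample size, and levels of measurement.

    from scipy import stats

    # Hypothetical pretest and posttest scores on the dependent variable (illustrative data only)
    control_pre       = [52, 55, 49, 60, 57, 54]
    control_post      = [53, 54, 50, 59, 58, 55]
    experimental_pre  = [51, 56, 50, 58, 55, 53]
    experimental_post = [68, 72, 65, 75, 70, 69]

    # H1: equivalence of groups in the pretest (independent-samples t-test)
    h1 = stats.ttest_ind(control_pre, experimental_pre)

    # H2: non-equivalence of groups in the posttest (independent-samples t-test)
    h2 = stats.ttest_ind(control_post, experimental_post)

    # H3: absence of treatment effect in the control group (paired-samples t-test)
    h3 = stats.ttest_rel(control_pre, control_post)

    # H4: presence of treatment effect in the experimental group (paired-samples t-test)
    h4 = stats.ttest_rel(experimental_pre, experimental_post)

    for label, result in [("H1", h1), ("H2", h2), ("H3", h3), ("H4", h4)]:
        print(f"{label}: t = {result.statistic:.2f}, p = {result.pvalue:.3f}")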

• Decide how you will design the study’s intervention or treatment

Decide how you will manipulate your independent variable. This entails such decisions as
what materials you will produce/prepare, what messages these materials will contain, how
you will package these materials, your timetable for the different activities involved in the
preparation of the materials, etc.

• Identify who your experiment participants will be

Your research problem and objectives will guide you in identifying who your participants
should be. Start by identifying the socio-demographic characteristics of your experiment
participants–such as their age, socio-economic status, educational attainment–and then list
the other participant characteristics pertinent to your study.

• Decide how to introduce the treatment/intervention

Keep in mind that: a) in a pretest-posttest study, you can only introduce the
stimulus/treatment/intervention after the pretest, b) the control group, if ever there is one,
will not be exposed to the stimulus/treatment/intervention, and c) if there are multiple
stimuli, you should determine the time lag between them.

• Decide how the data will be collected and design the appropriate research instrument

Most experiments make use of the survey method for the pretest and posttest data
gathering. The principles and procedures for conducting a survey are discussed in the survey
section. In some experiments, observation is also used to gather data. Whichever method
you decide to use, remember that you should use the same instrument (survey
questionnaire or observation protocols) for the pretest and the posttest, and for your
control and experimental groups.



10.4.3. Implementing your experiment

• Set up the experiment setting

If you are doing a laboratory experiment, recruit your participants and assign them to the
control or experimental group. If you are doing a field experiment, finalize your study sites.

• Gather your pretest data (if applicable): To reiterate, not all experiments have a pretest
(O1) phase

• Introduce your intervention/treatment to the experimental group/s

• Gather your posttest data, after the time lag specified in your design

Note that some experiments do not implement a time lag, i.e., they administer the posttest
immediately after the stimulus/treatment/intervention is introduced. This is usually the case
with laboratory experiments but could also happen in field experiments.

• Debrief the experiment participants

Debriefing is necessary when participants, usually in laboratory experiments, are not
informed beforehand about the purpose of their participation in the experiment. After the
experiment, the researchers should explain to the participants why the experiment was
conducted and why they (the participants) were selected. During the debriefing, the researcher
should obtain permission from the participants to use the data that they provided for the
study. If the participants refuse to be included in the study, they must be excluded from the
study sample. Instances of refusal should be included in the experiment report.

In field experiments, debriefing includes a proper exit from the communities that served as
study sites. You should notify the concerned officials and/or offices about the completion of
your experiment.

• Analyze the data collected and test the hypotheses using the appropriate statistical tests

• Interpret the results of your hypothesis tests to determine the presence and extent of the
cause-and-effect relationship

Keep in mind that absolute statements about the presence (and conversely, absence) of
cause-and-effect relationships are only possible when you conduct a true experiment and
have complied with all its requirements. Any deviation from the protocols of the true
experiment becomes a limitation on the extent to which you can claim the existence of
causality in your experiment.



11. RECEPTION ANALYSIS
Assistant Professor Jon Benedik A. Bunquin, MA & Assistant Professor Randy Jay C. Solis, PhD

Reception analysis investigates how audiences understand and utilize media content (or "text").
It extends to how audiences are influenced by media texts. This methodology recognizes the role
of an active audience in the interpretation and use of media texts, and the influence of texts on
audiences’ views, attitudes, and behaviors. In conducting reception analysis, text is viewed as
constantly interpreted, reinterpreted, and recontextualized by audiences or readers. However,
factors such as sociodemographic characteristics can shape how audiences interpret and repurpose
content. These factors can form patterns of practice, which are typically examined by researchers
conducting reception analysis.

Reception analysis is a mixed paradigm research method. According to Pernia (2004), Reception
Analysis:
- Can be approached both quantitatively and qualitatively
- Is informed by positivism (in studying media effects) and interpretivism (in studying audience
interpretation and audience use of media texts)
- Situates the investigation of audiences in its cultural/social, as well as
psychological/individual dimensions

11.1. Basic concepts

• Text

Text refers to anything that can be read—it could be an article in a newspaper, a scene on
television, a social media post by a friend, or a blog post by an online influencer. Text
carries meaning, and the way text is written, acted out on screen, ranted out online, or
posted in a blog has certain implications for those at the receiving end.

Does this mean that text, written and packaged to meet a desired end, will uniformly
influence audiences? Scholars of reception analysis believe otherwise. In reception analysis,
text is constantly interpreted, reinterpreted, used, and repurposed by audiences. Text is
polysemic, which means that it is open to multiple ways of reading and interpretation.
Moreover, audience characteristics come into play in the process of text interpretation,
usage, and sometimes even influence.

• Audience

Conceptualizing the audience in reception analysis is different from the dominant paradigm
in communication and media research, specifically in terms of the level of activity and
composition of audiences. Scholars doing reception analysis stay away from early concepts
of audiences as comprising one homogenous mass.

In terms of level of activity, reception analysis views audiences as active agents in the
communication process. This contrasts with the concept of audiences in linear models of
communication. When doing reception analysis, scholars keep in mind that audiences
interpret texts differently, and consequently, they construct their own meaning from media
texts.



Meanwhile, when it comes to composition, researchers doing reception analysis also look at
the concept of audiences (multiple decoders/interpreters of media text, as opposed to a single
audience). Audiences are perceived to be heterogeneous, each with their own unique use and
interpretation of media texts.

• Reception

Reception links text to audiences. It refers to the use and interpretation of text by audiences.
According to Stuart Hall, reception involves two processes:

a. Encoding, which is the active construction of a message by individuals before it is shared
with others. Although this part of the reception process is typically not studied in
reception research, recognizing that text is a product of encoding by individuals is
important to contextualize the analysis of data.

b. Decoding, which is the recipients’ processing of the message that is shared with them.
Decoding is premised on the idea of an active audience, and the interpretation of
information may vary from one individual to another. These variations are caused by
specific characteristics and contexts.

Stuart Hall identifies three ways by which audiences interpret or make sense of texts, based
on the similarity (or symmetry) of beliefs between the sender and the audience:

a. Dominant reading, which happens when the intended message by an encoder is aligned
with the message received by the decoder. This usually happens between an encoder
and a decoder who share similar ideals and beliefs.

b. Negotiated reading, which happens when the decoder accepts (or rejects) some parts
of a message from the encoder. A certain level of misunderstanding occurs between an
encoder and a decoder because they have some differences in ideals and beliefs.

c. Oppositional reading, which happens when the decoder totally rejects a message from
the encoder. While decoders understand the message as encoded, they may interpret
and construct meanings from the message differently. This is brought about by
contrasting ideals and beliefs between encoders and decoders.

• Mediated construction of reality

We know from social constructivism that everyday reality is constructed, which means that
our experiences are not naturally given. It is through our interactions with others that our
realities are shaped and re-shaped. Social practices that are consistently performed and
widely accepted become institutionalized in the society, crystallizing their place in the
realities of wider populations (Couldry and Hepp, 2017).

We get to learn about these practices and more through our socialization with others. By
communicating with other people, we also get to assign or derive meanings from these
social realities. In this regard, communication is able to maintain social realities, and our
language becomes the medium in delivering meaning to others in conversation. These
contribute to shaping the way we understand (or make sense of) our everyday experiences.



As our communication becomes more mediated with the developments in communication
and media technologies, we become more exposed to content outside our immediate social
realities, such as the practices of other cultures or communities, and the different meanings
behind messages. Reception analysis examines how texts, as representations of different
realities communicated through the media, are utilized by audiences in shaping and
reshaping their own realities. Researchers of fan studies, for example, might look into the
development of fandoms and how fans utilize media text in the creation and co-creation of
fan cultures through repurposing content from and for TV, print, radio, and other forms of
media (Jensen, 2012).

• Media effects

In reception analysis, media effects are usually studied as negotiated, as opposed to a direct
effect of a stimulus. This means that media effects, studied through reception analysis, are a
product of, on the one hand, the predictable and patterned ways by which media texts are
presented and, on the other hand, the socially constructed realities of audiences (McQuail,
2005). The key concept that undergirds this type of media effects study is, again, the
conceptualization of an active audience.

11.2. Procedures for doing reception analysis

Reception analysis is considered mixed-paradigm research; that is, it is informed by both the
positivist and the interpretivist paradigms. Hence, researchers doing reception
analysis have an array of research techniques they can employ in studying how audiences
interpret and use media texts, as well as how they are influenced by them.

In this section, we learn about the various procedures involved in studying reception, and unpack
the various techniques commonly used in communication and media research in inquiring about
audiences and text. Reception analysis utilizes a number of data gathering techniques, depending
on the point of inquiry of the researcher. Do note that only qualitative data collection techniques
are discussed in depth here, as there are separate discussions on surveys and experiments.

• Conceptualize the research

Reception analysis puts prime focus on audiences, linking them to the texts they consume as
well as to the various social, cultural, and psychological mechanisms that operate in the
process of consuming media texts. As with most research studies, it is important to arm
yourself with literature and exploring possible audience-centered topics that can help us
understand the active audience better.

Part of the conceptualization process in reception analysis is examining which aspect of
reception you want to study. In this chapter, we learned that there are three types:
interpretations, uses, and effects. Knowing the type of reception study that you want to do
also helps you as a researcher identify the theory and design most appropriate for your
inquiry.



In conceptualizing reception studies, consider the following elements in the communication
process:
- The type of text, examining how this type of text is interpreted and utilized by
audiences, as well as its possible effects on the audience.
- The characteristics of audiences, probing into how these characteristics intersect with
the use, interpretations, and effects of texts

• Design the research

Once you have a) decided on the concepts that you want to explore, b) examined the related
literature on the subject, and c) identified the theoretical approach to reception, the next
logical step is to design the research. As mentioned throughout this section, reception
analysis is considered a mixed-paradigm method that approaches communication- and
media-related phenomena both quantitatively and qualitatively.

11.2.1. Qualitative techniques in reception analysis

In-depth interviews, focus group discussions, and participant observations are used in reception
analysis to explore the various ways by which audiences consume media text. These methods are
commonly employed in two types of reception analysis: the use of texts and the interpretation of
texts.

In-depth Interviews

In-depth interviews (IDI) are known as conversations with a purpose (Burgess, 1984). In the
context of reception analysis, IDIs elicit detailed responses and rich data from
informants regarding their interpretation and use of text. In-depth interviews are useful in
reception analysis because this technique explores individuals’ perspectives, perceptions, and
opinions regarding different forms of texts, specifically those that they consume. In-depth
interviews are also more appropriate when researching audience topics that can be
considered sensitive, or those which may contain highly personal and private information.
Below are the steps in conducting in-depth interviews.

• Select informants

Informants are chosen based on the audience characteristics being explored by the
researcher. Researchers doing IDI may get a diverse set of informants, using maximum
variation sampling, to examine intersections between interpretation and use of texts, on the
one hand, and their characteristics as audiences, on the other. Meanwhile, researchers who
opt to focus on niche audiences may want to employ homogenous sampling, which means
getting informants who share a specific characteristic.

• Set the interview time and date.

Once you have an initial set of interview informants, discuss with the informants the possible
time and venue where they could feel most comfortable in sharing their thoughts, insights,
and experiences in their interpretation and use of texts. Having said that, you must also
consider your own comfort. The interview venue must work both for the researcher and the
informant.



• Prepare the interview guide and interview materials.

Asking the right questions requires comprehensive background about the subject. For
researchers, preparing for the interview means searching and reading up on the literature.
Learning what has been written about the media or content being studied can arm the
researcher with insights on formulating interview questions.

Semi-structured interviews work best for novice researchers and are less restrictive than
structured interviews. In terms of sequencing, semi-structured interviews are
open to deviations from the order of questions, especially if the informant wanders off into
topics to be asked later in the interview. When it comes to wording, semi-structured
interviews may have no precise wording, but employ themes or topics to be asked by the
interviewer. This type of interview can establish better rapport with the informants, since it
feels less like an interview and more like a conversation, and it yields richer insights from the
respondents than structured interviews. Moreover, since this type is open to deviations from
the guide, some questions asked may differ from one informant to another.

The interviewer must also prepare the voice or video recorder, paper for taking notes, and a
token of appreciation. In instances when the researcher needs to show video clips (e.g., for a
reception study of advertisements or music videos), the availability of the needed equipment
must be ensured, and its technical requirements and compatibility checked beforehand.

• Conduct pilot interviews

Before conducting actual interviews, it would be best to pretest the interview guide with
peers who are familiar with the research. These include people who have previously
conducted similar studies or who are members of the audience group being studied. This can
help you become more aware of certain nuances that may indicate red flags in the content
or process of the interview, such as insensitive language in the questionnaire as indicated by
some body movement from the interviewee. Moreover, this can also help you manage the
impressions you make on your informants, particularly among minority groups or special
interest groups.

• Conduct the interview

Before the actual interview begins, the interviewer must seek the informants’ consent
regarding the recording of the interview session. Only then can you proceed with the
interview.

Warm-up questions are asked at the beginning of the interview. These questions could be
about the informants’ profile—their background, current engagements, and other light
topics they could discuss before diving into the concepts being explored through the
interview. This helps establish rapport between the interviewer and the informant, set the
mood of the interview, and improve the quality of the interview session in yielding insights
from the informant.



The showing of the media content being examined in a reception study may be done at the
start of the interview to set the theme of the succeeding questions. However, to decipher
the audience’s engagement with the text more deeply, the media content may be shown
time and again throughout the interview. It is important to emphasize, however, that not all
reception studies require the showing of the media content involved in the analysis. If your
study requires the showing of a media material, decide when it will be best to show it.
Should you show it at the beginning of the interview or somewhere in the middle? You
should also decide how often you will show the material. Do you want to show it only once
or will it be better to show it (or parts of it) several times?

Throughout the interview, direct the conversation and cover the themes and items as
indicated in the interview guide. In addition, probe answers provided by the informant. This
can help clarify and enrich the responses of the informants during the conversation. You may
also provide follow-up questions whenever necessary.

• End the interview

You may ask informants if they have any questions regarding the interview or even the
research itself. Thank the informants and give them a token of appreciation for their time
and effort for the interview.

Focus group discussions

FGDs are the most commonly used research method in analyzing audience reception. As
compared to interviews, FGDs examine how an individual’s view relates (or not) or interacts (as in
social interaction) with that of another in a group discussion. Below are the steps in conducting
FGDs.

• Prepare for the FGD

A team is typically formed in preparation for a focus group discussion. Depending on the
scope of the research and/or the size of the focus groups, an FGD team may also recruit an
additional co-facilitator and another assistant. The members of the team take on the
following roles:
- Facilitator/Moderator takes control of the session and directs the FGD
- Documenter/Observer takes down notes, including non-verbal information
- Co-Facilitator helps in controlling the session, asks questions missed by the main
facilitator
- Assistant takes care of the venue, equipment, materials, snacks, and other needs

• Select FGD participants

In selecting FGD participants, remember to limit the group size to between 6 and 12 participants,
because this is the optimal size for a facilitator to be able to efficiently manage the
discussion. The researchers may select participants with similar characteristics (homogenous
sampling) or select participants with varying characteristics (maximum variation) under one
major criterion.



Ideally, select group members who represent the target population. It is suggested to select
participants who are not familiar with or within proximity of each other (i.e., residence,
departments, etc.) so that the sharing and discussion are more free-flowing, and the perspectives
are more diverse and comprehensive. In reality, however, there are instances in which FGD
participants know each other. For example, when examining the reception of community
campaigns, it is highly likely that the participants of an FGD will come from one neighborhood.

• Set the FGD time and date

The FGD team must pick a venue that is accessible to the participants. It must be relatively
quiet and comfortable, and the environment must stimulate open communication and dynamic
interaction. Remember that the ideal FGD duration is between 90 minutes and two hours.
Thus, the time and venue must be a) convenient for the informants to agree on and b)
conducive for them to participate in the entire duration of the FGD.

• Prepare the FGD guide and FGD materials

The researcher needs to ensure that the questions in the FGD discussion guide or
questionnaire are based on the framework and conceptualization of the research. But while
a semi-structured questionnaire ensures that the FGD is “focused,” the FGD team must also
be ready with contingency questions and probing or follow-up questions. The FGD team
must also prepare the voice and/or video recorder, notebook, ballpens, name tags, snacks,
and tokens.

Researchers doing reception analysis are also interested in how audience characteristics are
related to their interpretation and use of texts. Hence, it is also important to include a profile
sheet which contains these audience characteristics.

In the same way as in an interview, should you want to show media content—such as
audiovisual clips or Internet websites—to examine audience reception, the technical aspects
must be checked ahead of time.

• Conduct the FGD

Before the actual discussion, a member of the FGD team must see to it that attendance is
checked. The participants may fill out an attendance sheet or a more detailed individual
information sheet before being given name tags where their nicknames are indicated. The
nametags aid familiarity especially for the facilitator to manage the discussions later on.

As soon as the participants are done with the attendance sheet and/or profiling sheets, and
they have settled well in their seats, the facilitator may now start with the introductions of
the participants. The facilitator may use an icebreaker or any creative game for this. The
members of the FGD team must also introduce themselves before explaining the purpose
and the process of the FGD. A set of house rules may also be created: the “do’s and don’ts”
in the FGD session.



It is important that after explaining the purpose of the FGD, the facilitator should seek the
participants’ consent to have the session recorded. After this, the group may now start with
the actual discussion. The facilitator may start with a warm-up exercise or discussion starter
before proceeding with the first question in the questionnaire. For the first question, it helps
if the facilitator first calls on the most open and energetic participant in the lot. This helps
establish the mood of the entire session.

Should you feel the need to show the media content to be examined, this may be done at
the start of the FGD as a stimulant for the entire discussion. The media content may also be
presented at various stages of the discussion, for probing purposes or to draw attention to
specific elements in the contents.

All throughout the discussion, the facilitator must direct the flow of the responses by
bridging and connecting the responses of the participants. This may be helped by using
probing or follow-up questions depending on the flow of the responses of the participants,
especially when the responses are too short or unclear. Every now and then, the facilitator
may paraphrase what the participants are saying, not only to show that the facilitator is
attentive and listening to the participants, but also to check if the facilitator understood the
participants well. Towards the end, the facilitator and co-facilitator may summarize the
important points of the discussion, especially before closing the FGD session.

• End the FGD

Before ending the FGD, the facilitator may ask if the participants have any questions or
clarifications on the discussion, matters raised during the discussion, or the entire process of
the FGD or research. After this, the FGD team must thank the participants for sparing their
time for the research, and, if available, give the token of appreciation.

Participant Observation

Participant observation is usually used together with interviews or as a supplement to FGDs to
confirm or validate responses, especially because interviewees and FGD participants may be
prone to social desirability bias or halo effects. PO allows you to gain insights by directly noting
what is actually being practiced—patterns of media consumption and reactions to media
content—rather than what is being uttered. The following are the steps in conducting POs.

• Prepare for the field

Just like in FGDs, it is important to seek permission and help to be able to conduct the PO,
especially as POs ought to be conducted in the informants’ natural setting. A courtesy call of
community leaders may be organized where the research goals and processes are explained.
This courtesy call may also be a good start to identify key informants (KI) and contacts to
help you conduct the PO in the field. It is crucial though to pick your KIs well as there is
possibility of bias or influence that may impact on your interactions with other participants
in the study. Before going to the field, it would also help to read the literature and study the
field. This is useful for two reasons. One, the researchers may realize their own biases about
the community and may try to remove or downplay these assumptions before entering the
field. Second, reading the literature and studying the community may help the researchers
plan the timing and venues of observations and may also help them develop unstructured
interview questions together with the observation plan.



• Sample or choose the participants

In POs, informants or participants are mainly chosen through purposive sampling, or the
selection of participants based on characteristics that may yield the most
comprehensive understanding for the reception study. This may be strategized further by
following quota sampling, or selecting individuals in different categories (such as heavy,
moderate, or light viewers of soap operas). Another strategy is to do snowball sampling:
referrals from existing informants may help the researcher locate a relevant subject, for
instance the most influential person in the community when it comes to new and upcoming
soap operas. Another sampling strategy is to select deviant cases that challenge, and
therefore illuminate further, the regular patterns of consumption and reception of the
media content being studied.

• Conduct the PO

In POs, recording the observations is most important. However, this poses a great dilemma
to researchers as recording “on the spot,” such as taking down notes or documenting using
recorders and cameras, may obstruct the natural dynamics of the people in the field.
Participants may become too conscious that they are being observed (also known as the
Hawthorne effect) and may “contaminate” the observation process. On the other hand,
waiting until after the field visit to document the observations may result in the loss of some
vital information. Thus, the researcher is encouraged to practice flexibility with regard to
recording observations, balancing the “right-there-right-now” context of the researcher with
the integrity of the data that they are gathering. When recording the observations,
take note of the location, the duration and frequency of the observation, the demographic
information about the participants, as well as their behaviors and practices, particularly in
relation to their media consumption and engagement with the media content.

11.2.2. Quantitative research techniques

Earlier, you learned how to conduct reception analysis qualitatively. But the audiences’
consumption of content can also be investigated using quantitative research techniques.
Typically, quantitative reception analyses look into the third form of reception analysis studies,
the effects of text, which is drawn from media effects research.

Campaign planners can benefit from reception data by understanding how their audiences are
influenced by messages. This can maximize the persuasive power of the advertisements they
send out to potential consumers, by designing media-relevant messages and more efficiently
targeted campaigns. Advertisers can focus their ads to target audiences, by knowing what
specific TV shows they like to watch. They can even venture into production of content for other
forms of media once they know which ones generate high viewership and high engagement
among audiences. Organizations implementing campaigns examine differences among audiences
of media and maximize such information when coming up with media strategies and messages.

Quantitative reception analyses can make use of two popular quantitative research methods:
surveys and experiments. The following discusses how you can implement these methods in
studying audience reception.



Surveys

Surveys are used in reception research to get insights regarding the associations between media
and content use, on the one hand, and audience behavior, on the other. While the section on
surveys provides an extensive discussion of surveys, as well as a step-by-step elaboration on
conducting them, some variables must be considered in developing a survey on audience
reception (a brief analysis sketch follows this list):

- Socio-demographic characteristics, such as age, gender, socio-economic status, and
ethnicity are usually included in survey research. In reception analysis, these characteristics
are used to segment and compare differences of media use and media content experience
based on population characteristics.

- Media exposure, such as average hours of consuming TV, radio, newspaper, film, outdoor
media, or social media content are examined to measure viewership or readership of media.
You may also focus on a specific media format to examine audience use of such media. New
research on media use probes into emerging and unexplored types of media. Media
consumption habits are also asked under media exposure, which look into information
sources, information recall, and perception about content.

- Media usefulness can also be included in reception surveys. The uses and gratifications
survey by Katz, Gurevitch, and Haas (1973), which led to the development of the uses and
gratifications theory, asked “How important is it for you to…?” followed by 35 statements
about different human needs (e.g., to spend time with friends, to keep up with the way the
government performs its functions). Each statement was then followed up by the question
“how does [media] help you to [human need]” (Bracken & Lombard, 2001).

- Knowledge, attitudes, and behaviors, which are typically asked about in surveys, are used in
reception analysis to establish the link between text and audience processing of information.
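
As a minimal illustration of how such survey variables might be analyzed, the Python sketch below
relates weekly exposure to a media text to an attitude score and then compares attitudes across a
socio-demographic characteristic. The variable names and values are hypothetical assumptions for
illustration, not data from any study cited in this primer.

    from scipy import stats

    # Hypothetical survey responses (illustrative values only)
    hours_exposure = [2, 5, 1, 7, 3, 6, 4, 8]                  # weekly hours of exposure to the media text
    attitude_score = [3.0, 3.8, 2.5, 4.5, 3.2, 4.1, 3.6, 4.7]  # attitude toward the issue on a 1-5 scale
    gender         = ["F", "M", "F", "F", "M", "M", "F", "M"]  # a socio-demographic characteristic

    # Association between media exposure and attitude (Pearson correlation)
    r, p = stats.pearsonr(hours_exposure, attitude_score)
    print(f"Exposure vs. attitude: r = {r:.2f}, p = {p:.3f}")

    # Segmenting audiences: compare mean attitude scores between two socio-demographic groups
    female_scores = [s for s, g in zip(attitude_score, gender) if g == "F"]
    male_scores   = [s for s, g in zip(attitude_score, gender) if g == "M"]
    t, p = stats.ttest_ind(female_scores, male_scores)
    print(f"Attitude by gender: t = {t:.2f}, p = {p:.3f}")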

Experiments

In reception analysis, experiments are used to test the effect of text on audiences. Earlier, you
learned the different types of experiments and the steps in conducting them. You might recall
from that section the various local cases from which you could draw inspiration in designing your
own communication and media research experiment. In essence, most of these studies are
reception analyses, because they examine the role of texts and how they influence audience
behavior.

What separates an experiment based on reception analysis is its consideration of the various
social, cultural, and psychological traits of audiences, and how these come into play when
examining the effects of messages. Usually, a survey form is administered to measure these
concepts. When it is administered depends on the design of the experiment.



12. CASE STUDY
Associate Professor Jonalou S.J. Labor, PhD

Case study research, as a method and an approach, is used in the social sciences as a means to
answer contemporary research questions. The rise of mixed-method studies has resulted in
renewed scholarly interest in the case study as it provides a) contextual description and in-depth
analysis of a specific issue as well as b) explanations of causality as regards a communication
phenomenon in a real-life setting.

Case studies provide practical and context-based knowledge. They help researchers explain the
development of individuals, organizations, communities, and, eventually, societies. Case studies
are necessary in exploring communication phenomena, especially in a) generating and testing
hypotheses, b) building and solidifying theories, and c) confirming propositions using specific
cases.

12.1. Basic premises

Flyvbjerg (2006) notes that case study research has the power to provide an in-depth
understanding of how processes work and why effects happen after a successful (or unsuccessful)
exposure to a certain text, event, or even a social phenomenon. The mixed-paradigm approach
of case study research allows for both the description of a process and the investigation of
causality within the unit of study.

12.1.1. Case study vs other methods

• Differentiating process documentation: the case study vs. an ethnography

The interpretivist nature of case study research allows a researcher to look into a
communication phenomenon from an informed standpoint. This means that the interaction
between the phenomenon and the inquirer creates meaningful interactions that aid in the
interpretation and analysis of a social event. Both an ethnographer and a case study
researcher have this kind of an engagement with their subject matter. Epistemologically, the
interpretivist nature of ethnography and case study research is similar in the sense that both
argue that a social world exists because of the co-constructed experiences of those who live
in it.

The difference lies, however, in the way the researchers focus on what to study and which
voice to use in the analysis. Case study researchers want to study a phenomenon because
they want to document a state of events and a process. Unlike an ethnographic work that
documents rituals and the understanding of participants as regards a communicative event,
the case study is able to explain the acts and events participated in by an individual or a
group from the lens of the researcher. Moreover, case study researchers can explain the
reasons for individual action or behavior.

If ethnographers are able to culturally interpret the practices of a group of individuals, case
study researchers are able to explain how and why social events happen. Case study
researchers use various perspectives in order to provide a holistic understanding of the
situation.



• Establishing causality: Case study vs. other positivist methods

The section on experiments discusses how causality may also be studied using case studies.
It explains the focus on causality in case study research is defined by its analytical approach:
process tracing by finding the causes of effects. By looking into multiple sources and by
multiple methods or data-gathering techniques such as in-depth interview, document
analysis, message analysis, and participant observation, the researcher can establish an
occurrence of meaningful events that potentially leads to a conclusion.

12.1.2. Definition

• What is a case?

In social science research, there seems to be varying notions of what a case should be. Ragin
and Becker (1992) forwarded the notion that a case may be theoretical or empirical or
methodological construct or object or a process. They stated that cases are identifiable
elements of a system being studied. Such a system could be an organization or a community.
Ragin and Becker also mentioned that cases are objects, too. These are pre-existing
representations of an empirical body.

Cases are also seen as conventions. In this sense, cases are considered as theoretical
constructs. This means that a case becomes one through an agreement from a collective
body of scholarly work. For instance, women’s blogs and vlogs become cases for a study of
online media representations of femininity because there is an agreement among scholars in
a socio-scientific community to treat blogs and vlogs as cases and as sites of online
representations of women.

Dumez (2015) appeared to contradict the notion that cases have boundaries as previously
mentioned by past scholars. He stated that a case has a narrative essence in relation to a
theoretical issue. He further said that three fundamental questions must be asked before a
researcher proceeds in doing a case study.

Finally, it is important to remember that cases could also be groups, institutions,
communities, and even texts produced by people and circumstances. The bottom line here
is that cases should be representative of the population of the social unit being studied. In
the words of Aristotle: “Definitio fit per genus proximum et differentiam specificam (the
definition proceeds by the nearest genus and specific difference).”

• What is a case study?

Case studies are not mere research methods but are approaches in analyzing data. Fidel
(1984) mentioned that a case study researcher goes out in the world to look for descriptions
of a communication event. This means that the researcher is an analytical viewer of events
who commits to the examination of the complexity of a real-life situation.

Case studies were initially used in the fields of psychology and political science, especially in
the testing of newly developed forms of interventions. The case study approach was the
appropriate method because the researchers could observe and record changes in a case
after an intervention was introduced to it. There were researchers who used the method to
test research hypotheses (Naumes & Naumes, 2006). This meant that already established
theoretical arguments that came from previous research, or new ones that argued for or
against a framework, were subjected to analysis. Of course, the mindset here was not to
generalize but to check the factors that led to a particular effect.

Case studies are also a method of construction (Baxter & Jack, 2008). They merge integrated
accounts of people and experiences in order to create an in-depth and multifaceted view of
the phenomenon. The malleability of case study research in doing explorations and
establishing causation is the strength of this method. Its mixed-paradigm
orientation allows the researcher to provide descriptions of communicative events and
establish causality among factors and conditions.

In the communication and media research context, the case study is used to examine a
phenomenon, with the goal of documenting what the phenomenon is, how it develops or
degenerates, and why it grows or declines. Doing a case study means undertaking a detailed
investigation of the contexts and processes underlying a phenomenon. When
communication scholars adopt the method to study communication and media events, they
emphasize that the case study is useful as an empirical investigation of contemporary
communication phenomena (Rowley, 2002; Yin, 2003). Contemporary, in this context, means
that a case study is used to explore how individuals practice, adapt, and eventually live in
realities that are previously undocumented and unresolved. Contemporary may also mean
that certain communication innovations that are introduced to an individual, a group, or a
community need assessment.

Because of its pragmatic nature, case study research in the area of communication and media
studies can be used to provide a well-developed set of thematic descriptions of behaviors
and events (Hancock & Algozzine, 2006; Sturman, 1997). The contextual nature of the case
study allows a researcher to examine each part of the situation in its real-life context. The
use of conceptual categories enables a researcher to be guided in analyzing a phenomenon.
The researcher who has knowledge about the phenomenon and a pre-understanding of the
context or situation can very well construct thick descriptions of the situations.

Case study research is also known as a deep observation of an “individual” unit in relation to
a phenomenon (Suryani, 2008). This means that a case, being a representation of a specific
real-life situation, provides an illustrative dimension to a population that is hard to study in
its entirety. Doing a case study is providing an objective description of an incident, situation,
or an occurrence.

A case study is also more than a descriptive method. It is also a way to understand the causal
relationships among the factors that lead to the outcomes of an event. Flyvbjerg (2011)
notes that the case study is used when a researcher wants to find the influences that
determine the results in the individual unit. It looks into the boundaries of the case too so
that the researcher could discriminate which factors led to the effect. Because of this basic
goal, “case studies comprise more detail, richness, completeness, and variance—that is,
depth—for the unit of study than does cross-unit analysis” (p. 301). The intensive nature of
the method, together with the idea that the researcher can see the connections among
factors in one or multiple cases, makes case study ideal in explaining and evaluating a
communication and media problem. The very nature of a case study is comparison (Dumez,
2015).

Finally, the Merriam-Webster dictionary (2018) defines a case study as “an intensive analysis
of an individual unit (as a person or community) stressing developmental factors in relation
to environment.”

Thus, as a method, case studies provide a holistic look into the nature and process of the
communicative event being studied. In the field of communication and media studies, case
study research can be applied in studying group processes and structures. Case study
research can also look into the contribution of mediated communication materials and texts
to the everyday life of humans.

Taking account of all these points, case study research in communication and media studies
is a deliberate strategy that must be rounded to establish a stable conclusion. Roundedness
here means that multiple methods must be used to establish a) firstly an occurrence and b)
eventually a recurrence of factors that explain a communication phenomenon.

12.1.3. Characteristics of a case study

Once the researcher has identified the case, the next task is to define its boundaries. The
following are the features of case studies:

• Not rigidly planned

A researcher who would like to embark on a research that uses the case study method
should be able to embrace the idea that he or she is venturing on an exploratory journey.
This means that the researcher, armed with knowledge about the subject matter, must have
an open mind in documenting what is out there in the field. There should be tolerance for
any unforeseen scenario and trust that the events lived by the case or cases would lead
to the right findings. Given that the phenomenon under investigation guides the conduct of
the research, the researcher should be able to record and create an insight out of what is
seen (and not seen) in order to understand the observed situation. Case study research
provides less control over the variables that are under investigation.

• Detailed

Since the aim of a case study is to look for patterns of regularities, the method allows the
researcher to explain the process and understand why certain conditions occurred. A case
study offers rich and detailed descriptions of a phenomenon. The narrative that a
researcher draws from the case leads to an interpretation of the situation under study.

• Contextual

This communication research method acknowledges the fact that a phenomenon can be
studied in its entirety without the necessary requirement of a replication. There is also an
acceptance of the fact that its analytical approach can only go as far as tracing the process
and not determining the effects of certain causes. The non-replicability of the research is largely
because the context dictates the nature of the inquiry.

• Interactive

Case studies are constructed works. The findings are drawn from the scholarly bias of the
researchers. There must be a constant comparison and contrast between the
communication situation being described and the reflections of the researcher.

• Propositions-based

Case study researchers would know that they are doing a case study if there are propositions
that they would like to confirm in the investigation. A proposition is an assertion that must
be proven or disproven by the case. The researcher must be aware that the unit of analysis
could provide the data for the assertion, and eventually, the proposition to be confirmed or
disconfirmed. Of course, for such a proposition to be logically supported or debunked, there
should be a set of criteria that should be used in interpreting the findings.

12.2. Procedures for doing case study

• Start with the research conceptualization

Determine and define the research question. The researcher must be able to carefully
define the research question. As previously mentioned, case study research begins with a
“How” or a “Why” question. In defining the research problem, the researcher must be able
to take time to look at what is already known about the phenomenon. Fidel (1984) noted
that a researcher must be familiar with the subject matter to be investigated. This requires
the researcher to identify the existing documents and records that would support the study.
Moreover, Rowley (2002) mentioned that the researcher must also ask the following
questions before deciding to venture into case research:
- What is the existing knowledge about the phenomenon?
- Do I understand the field research procedures?
- What would be the various sources of information that I need, and do I have access to
such sources?
- Do I understand my own case study question(s)?

Always remember your case study goals. To ensure that the case study design is valid, the
researcher must be keen in using tactics that would make the conclusions sound and
justified. For case studies, the aim is not to establish statistical generalization but rather
analytical generalization (Rowley, 2002).

Here are some suggestions on how to arrive at generalizations:

a. Construct validity—Researchers are encouraged to use multiple sources of evidence,
establish a chain of evidence, and have key informants review the draft case study.

b. Internal validity—Researchers must use pattern matching, do explanation building, do
cross-case syntheses, and create a time series analysis to constantly check the value of
the data (Yin, 2003).

c. External validity—The use of replication logic in multiple case studies and strong
adherence to protocols must be ensured to arrive at a sound conclusion.

Be mindful of the length of time spent on the research. Time is also a consideration in doing
this type of study. If researchers embark on a multiple case study, then they must be able
to project a reasonable timeline for data construction and interpretation. Pernia (2004)
wrote that researchers must devote considerable time to drawing out themes of meaning
from their participants. Researchers must set aside time to make sense of voluminous
transcripts and observation sheets that are typical in a case study.

• Identify the units of analysis or data

It is important for researchers to know how to identify the qualities of a good case. Here are
some of the qualities that must be present before a researcher delves into a case:

a. Interesting—A communication researcher must be able to identify that the case is
appealing and worthwhile. It must capture the interest of the researcher and the
imagination of the readers.

b. Representative—Chosen cases must be able to reveal the phenomenon under
investigation. The researcher must ensure that the case would be normal, and that the
events in which the phenomenon happens may potentially happen again.

c. Realistic—The case should be drawn from real-life situations about real characters. It is
expected that the case depicts actual processes and practices.

d. Objective—The case presents events as factually as possible, showing events, facts, and
the actions of those involved in the phenomenon as they actually took place.

e. Moderately complex—The chosen case should not be so simple as not to warrant a
thorough investigation. The case should be able to compel insights from the researcher.

• Identify the case sites

Determine the site. Case study research involves the collection of data through multiple data
collection techniques. A communication researcher must have the capacity to do
preliminary visits and must be able to decide whether the case site is indeed the right
venue for the study. The researcher must establish a network of resource persons who may
be able to help him or her find a case site that matches the demands of the study.

Make initial contact. Once the site has been selected, the researcher must be able to
anticipate a successful initial contact. Naumes and Naumes (2006) said this part of the
documentation sets the tone for future interactions and determines the capacity of the
researcher to continue the case study effectively. It is recommended to have a contact or a
“go-between” inside the site so that direct relationships could be established.

Gain access to important persons and data. Even if initial understanding between the
researcher and the intermediary has been established, there is still a necessity to establish
connections with those who could provide other equally important information. Gaining
permission to access records and other forms of data is as important as the entry to the
study site. At times, the entry to the study site is the least of the researcher’s concern.

• Develop, pretest, and revise instruments.

After identifying the orientation and design of the investigation, it is time to dig into the case.
Unearthing the nature of the case is crucial, so the researcher must have a sturdy
set of data collection techniques that would help in getting the right information. Case
studies use a variety of quantitative and qualitative data collection techniques. One may use
surveys, interviews, and document analysis in order to come up with data. Triangulation is
necessary in a mixed-method study because it allows researchers to use multiple sources of
evidence so that they could provide a greater understanding of the particularities of the
case. There should also be a thorough consideration of the use of theory in the case
study.

The conduct of the study requires a communication researcher to be knowledgeable about
the use of various data collection methods. As a mixed-methods undertaking, the case study
requires a variety of ways to get data.

Here are some suggestions on how to collect pertinent information from various data
collection techniques:

Interviews

The use of the interview is essential in providing in-depth information about the case. By
identifying key participants, the researcher is led to both knowledge and opinions that may
provide insights regarding the research question. Whether a study is bound to do an
exploration, a description, or an explanation, interviews yield significant information from an
individual’s perspective. Individual interviews elicit personal perspectives while group
interviews provide shared and co-created ideas and viewpoints.

Since the case study method wants to examine retrospective, snapshot, and diachronic cases,
it is recommended that the interviews be semi-structured. This means that the
researcher has a predetermined set of questions but is open to tentative answers and
follow-up questions. It is also more inviting to the interviewees if they could openly define and
discuss the phenomenon from their own perspectives.

Interviewing requires a special set of skills. Open-ended questions usually work best because
they can elicit in-depth information such as definitions, reasons, understanding, and
explanations of events and experiences. It is expected, therefore, that a case study
researcher is equipped with the appropriate skills in doing the interview.

The interview process must happen in a “natural” study site. This helps the informants be
more comfortable with the questions because they are in their own territory. It also allows
the researcher to better understand the site and its relationship with the situation being
studied. It is often the case that informants can remember and recollect information with
the visual cues of the place.

Interviews are frequently used in case study research because of their functionality. A
researcher must be able to identify what kinds of questions are needed to answer the
research problem and, after doing so, formulate questions for each researchable
subproblem. It is also important that the researcher cross-reference interview topics and
items to ensure that no research objective is missed during the actual data gathering. The
researcher should also develop an interview structure so that a minimum set of information
is gathered from each respondent. Lastly, the researcher must confirm the ethical
appropriateness of the questions that would be asked of the interviewees.
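
To make this cross-referencing concrete, here is a minimal, purely illustrative sketch in Python; the subproblems and questions are invented for the example and are not drawn from any actual study. It flags any researchable subproblem that no interview question addresses:

```python
# Illustrative sketch only: cross-reference interview questions with
# researchable subproblems. All labels and questions are hypothetical.

interview_guide = [
    {"question": "How did you first hear about the campaign?", "subproblem": "SP1"},
    {"question": "What made you decide to try the practice at home?", "subproblem": "SP2"},
    {"question": "Who in the household influences these decisions?", "subproblem": "SP2"},
]

subproblems = {
    "SP1": "How was the campaign message received?",
    "SP2": "Why did households adopt or reject the practice?",
    "SP3": "How has the practice changed over time?",
}

covered = {item["subproblem"] for item in interview_guide}
for code, statement in subproblems.items():
    status = "covered" if code in covered else "NOT covered - add a question"
    print(f"{code} ({statement}): {status}")
```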

Participant observation

Case study researchers construct the phenomenon along with the insights from the
informants and participants. One way to provide information objectively is through a skilled
process of observation. For this technique to work, the researcher must know what to
observe. A researcher who wants to identify factors that made a handwashing campaign
influential to the health behaviors of a community may decide to observe individuals’
behaviors in their homes during and after the campaign roll-out. A case study researcher
on technology adoption may ask to observe how farmers use certain materials in the fields.
It is, therefore, important that a case study researcher create an observation guide wherein
all features that must be addressed (seen, heard, tasted, felt) during the process of
observation would be listed. The guide, along with the observation notes made by the
researcher, provides systematic data for review and analysis.

Just like the interview, it is important to gain access to the actual and natural setting of the
participants as this may foster openness and trust within the community. A researcher,
however, must strive to be unobtrusive in the observation. All ethical and legal requirements
regarding research participation must be accomplished before the actual participant
observation.

Spradley (1980) developed a strategic way to do an observation. Even if the work has existed
for four decades now, there is still value to the categories of observable phenomena and the
types of questions to be asked per category. The descriptive categories and questions are as
follows:

- Space × Space: Can you describe in detail all the places?
- Space × Object: What are all the ways space is organized by objects?
- Space × Act: What are all the ways space is organized by acts?
- Space × Activity: What are all the ways space is organized by activities?
- Object × Space: Where are objects located?
- Object × Object: Can you describe in detail all the objects?
- Object × Act: What are all the ways objects are used in acts?
- Object × Activity: What are all the ways objects are used in activities?
- Act × Space: Where do acts occur?
- Act × Object: How do acts incorporate the use of objects?
- Act × Act: Can you describe in detail all the acts?
- Act × Activity: How are acts a part of activities?
- Activity × Space: Where do all the places and activities occur?
- Activity × Object: What are the ways in which the activities incorporate objects?
- Activity × Act: What are the ways in which activities incorporate acts?
- Activity × Activity: How are you going to describe in detail all the activities?

Document analysis

The case study researcher is also required to review published sources as evidence that
support the initial assumptions in the case. Reviewing existing documents or creating and
administering new ones provides useful information about the nature of the case. Documents
provide narratives that may illustrate trends and other significant outcomes. Documents
may include extracted records from online files, public or private records, physical evidence,
and even the instruments used during the study. Such data sources may
provide in-depth insights into a person’s or an organization’s beliefs, attitudes, and even
practices.

When combined with other forms of data, the evidence from the various documents could
serve as a rich set of collected information from multiple data sources. When a case study
researcher gathers information from documents, it is important to ask the following
questions (Hancock & Algozzine, 2006):
- What sources are available that can be used to provide answers to my research question?
- What types of answers will be available if the document is used?
- How will information be selected from all that is available?
- How will the documents be represented during data analysis?
- What ethical concerns are relevant with regard to documents that will be analyzed?

A researcher must, however, vouch for the accuracy of the documents that were used in the
research. The study may suffer from bias if the researcher fails to find original or verifiable
pieces and sources. In order to assess the authenticity of a document, Hancock and
Algozzine (2006, citing Clark, 1967) provided some questions that researchers may ask
before considering a document as part of a data set:
- Where has the document been and what is its history?
- How did the document become available?
- What guarantees exist that the document is appropriate, accurate, and timely?
- Is the integrity of the document without concern?

Survey

Previous chapters have discussed the function of surveys in establishing causality. For case
study projects, the value of a survey rests on its capacity to establish “causes of effects.”
Surveys may provide correlated and associated factors that caused an effect through
statistical modeling. In the same breath, case study research looks for the mechanisms in the
factors that allowed the effects to happen.

How much data is needed in a case study? This is a tricky question that is often thrown at
case study research. Some are not comfortable with the small number of units that the
design wants to study. Some scholars say that data, which could be both quantitative and
qualitative, may come from one set or a collection of samples from various sets that
represent the population. Remember, however, that this type of research does not aim to
make statistical generalizations.

The answer to the “how many or how much” question would always be dependent on the
nature of the inquiry. If communication researchers seek to investigate mobile phone use for
dating purposes, they may look at practices of individuals and provide an analysis of dating
styles. If they want to look into the organizational culture of a government agency, then they
may study one institution with two departments as study sites to establish similarities and
differences in practices. Case studies are used for various phenomena, but they share the
same objective of imbuing a phenomenon with substantive context and approaching it
holistically.

• Code your data

When the gathering of information for a case study is complete, a coding guide must
be put in place in order for the researcher to know if the right information has been
collected. If not, then the researcher has to go back to the field to collect the still missing
data, as sketched below.
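
As a minimal, purely illustrative sketch (the codes and excerpts below are invented, not taken from any study), the completeness check could look like this:

```python
# Illustrative sketch: check the collected data against a coding guide
# so the researcher knows whether a return to the field is needed.
# The codes and excerpts are hypothetical.

coding_guide = ["adoption motives", "barriers", "sources of information", "household roles"]

tagged_excerpts = {
    "adoption motives": ["We wanted the children to stop getting sick."],
    "sources of information": ["The barangay health worker demonstrated it to us."],
}

missing = [code for code in coding_guide if not tagged_excerpts.get(code)]
if missing:
    print("Return to the field; no data yet for:", ", ".join(missing))
else:
    print("Coding guide fully covered; proceed to analysis.")
```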

• Analyze, interpret and report your data

The next section focuses on these three steps—for case study and the other methods.
Continue reading to learn more about these steps.

PART 3
Analyzing and Reporting Research in
Communication and Media

13. THE RESEARCH DISSEMINATION PLAN
by Professor Violeda A. Umali, PhD

In simplest terms, research dissemination can be defined as the act of communicating, or making
known, the research results to other people. There are different people and sectors to whom the
research could be communicated; they include other researchers, government and non-
government agencies, specific sectors of the population (e.g., youth, health workers, media
practitioners, etc.), and the general public. Depending on the nature of the study and the
researcher’s goal, research findings could be communicated to local, national, and/or
international audiences.

Dissemination is primarily associated with the formal, comprehensive written research report,
for example, the thesis or dissertation, a journal article, or a project report submitted to the
funding agency. However, research results are also disseminated in other ways, in written and
non-written forms. Research results could be disseminated in the form of popularized reports or
feature articles released through mass and interactive media channels. Policy briefs and the so-
called “white papers” are other options for communicating research findings in the written form.
Non-written forms of research dissemination include paper presentations in conferences and
colloquia, video presentations, press conferences, and media guesting.

Dissemination is commonly understood to be done at the completion of a research project.
However, dissemination of findings need not wait until the research project is completed.
Indeed, for research projects like an undergraduate thesis, dissemination often happens after the
submission of the thesis manuscript. But for graduate theses and dissertations, partial research
findings could already be disseminated, for example, in conferences. In so doing, a researcher
could get feedback and suggestions that could be helpful in further improving the study. Most
research projects, especially those that run for several years and/or have a large scope, release
research findings periodically, while the research is still ongoing.

Dissemination is not an option for, but an obligation of, researchers. The fundamental rationale
for conducting research is to contribute new knowledge for the benefit of society. Research is
not undertaken for its own sake; it is meant to help people gain a better understanding of the
world around them and find better ways of doing things. And the only way that research could
be of help to people is to share research findings with them. We must always remember that
“research is only useful if it can be accessed and understood” (CRU, 2011). Disseminating
research findings is, therefore, an ethical responsibility of researchers.

Planning for research dissemination

Given the scope, scale, and attendant challenges of research dissemination, it is necessary to
have a proper dissemination plan. The research dissemination plan is a document that contains
details about what research findings are to be disseminated and how they are to be
disseminated. The plan enumerates strategies and tasks starting from the data processing stage
of the research and continuing until the public release of the research findings. In a standard
research proposal—say, for a thesis or dissertation—plans for data processing and analysis are
discussed; however, other dissemination concerns, such as how the data are to be shared with
various publics, are not included. In contrast, for large-scale research projects that receive
external funding, a full-blown dissemination plan is often part of the requirements. Regardless of
what are required, it is good practice to prepare a comprehensive dissemination plan for any
research that we undertake.

Preparing a comprehensive research dissemination plan yields several benefits for the
researcher:
- First, it can be a source of additional inputs for formulating the study’s data gathering
instruments and for finalizing protocols for implementing the research method/s chosen.
- Second, by consciously identifying the intended end users of the research findings, the
researcher will be able to identify possible ethical and/or legal issues regarding the public
release of findings.
- Third, having a dissemination plan makes it possible for the researcher to estimate the cost
and other resource requirements of implementing the different dissemination activities.
- Finally, a research dissemination plan, particularly the activities pertaining to the public
release of the research findings, will help the researcher prepare for the work that lies ahead
after the study itself has been completed. When a researcher decides to engage in public
dissemination of her/his research findings, s/he should realize that there is more work to
follow after the full research report (e.g., the thesis or dissertation) has been written and
submitted. The scale of the work to be done depends on what dissemination venues the
researcher plans to tap.

Research dissemination planning entails that you undertake the following:


- Identify your objectives for the public dissemination of your research findings.
- Identify the venue of the public dissemination of your research findings.
- Identify the audience/s of your public dissemination.
- Identify the research findings to be shared.
- Identify the research dissemination material and/or format that fits each of your
dissemination objectives.
- Identify your dissemination partners.
- Specify the timeline (activities and corresponding schedules) for each research dissemination
objective.
- Determine the resources required for each research dissemination objective.
- Assess your capability to acquire the resources needed for your research dissemination
undertakings.
- Revise your research dissemination objectives if needed.
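
One simple way to keep these elements aligned is to record each dissemination objective as a structured entry. The Python sketch below is only an illustration; every field and value is a hypothetical placeholder, not a prescribed template:

```python
# Illustrative sketch: a single entry in a research dissemination plan.
# All values are hypothetical placeholders.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DisseminationActivity:
    objective: str
    audience: str
    venue: str
    findings_to_share: str
    material_format: str
    partners: List[str] = field(default_factory=list)
    schedule: str = ""
    resources_needed: str = ""

activity = DisseminationActivity(
    objective="Share key findings with local health workers",
    audience="Municipal health workers",
    venue="Regional health communication forum",
    findings_to_share="Themes on how the campaign was received",
    material_format="Policy brief and a short presentation",
    partners=["Municipal health office"],
    schedule="Two months after submission of the full report",
    resources_needed="Printing and travel budget",
)

print(activity)
```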

14. QUALITATIVE DATA ANALYSIS AND INTERPRETATION
by Associate Professor Julienne Thesa Y. Baldo-Cubelo, PhD, Assistant Professor Ma. Aurora
Lolita L. Lomibao, MA, & Assistant Professor Randy Jay C. Solis, PhD

14.1. Overview

14.1.1. Purpose of analysis in qualitative research

When you are finished with data collection, you are ready to bring together everything that you
have gathered and try to make coherent sense of it. This is what we
call qualitative data analysis. It refers to the processes and procedures that researchers utilize
to organize, identify, and examine their data and provide some level of explanation,
understanding, or interpretation.

There is both good news and bad news for qualitative researchers: there is no one universal way
to analyze qualitative data. This is good news because it allows researchers to be free and
creative in interpreting their data, with or without the involvement of their research participants.
But this can also be bad news, because an idle researcher can make misguided conclusions based
on faulty interpretation of data! Because it is not guided by universal rules, qualitative analysis
can be a very fluid and continuous process. It is highly dependent on the researcher and the
context of the study.

For qualitative researchers, the process of data collection and data analysis is not linear. This
means that, for some researchers, data analysis comes after all the information has been
gathered. But for others, analysis can occur simultaneously with data collection, or as the
research progresses. Thus, a researcher’s analysis can change during the course of the study, and
as the data emerge.

There are two important things to remember when you are doing qualitative data analysis: first,
you must achieve meaning and understanding from the data, and second, you must determine
how your analysis helps to answer your research questions, or to draw conclusions.

Qualitative data analysis is an iterative and reflexive process that begins even as the data is being
collected, rather than after data collection is finished. For instance, in an ethnographic study,
researchers can note their ideas about the meanings of the text, next to their field notes or
interview transcripts. They can also make initial guesses about how these might relate to other
issues. This process of reading through the data and interpreting them continues throughout the
project. Researchers who are conducting a textual analysis can also start interpreting their data
even while the study is still in progress, making marginal notations on their coding sheets as they
make new observations or notice patterns.

Why should qualitative research proceed in this manner? Why can you not wait until data
gathering is over before you start with analysis? Well, you cannot delay the analysis, because an
iterative (or repetitive) approach and emerging design are at the heart of qualitative research.
This involves a process whereby researchers move back and forth between sampling, data
collection, and data analysis, to accumulate rich data and interesting findings. The principle is
that what emerges from data analysis will shape subsequent sampling decisions. Immediately
after the very first observation, interview, or focus group discussion, you have to start the
analysis and prepare your field notes.

The analysis of qualitative data can thus be treated as a careful reading exercise. This means
looking closely at the information collected, reading it through, and assigning sections to codes
or themes as you proceed. This is the first of many “readings” that qualitative data should go
through, because analysis of qualitative data is an iterative, cyclical process. Going through your
data again and again enables you to challenge your coding strategy, stay alert for
new meanings emanating from your data at each reading, see patterns in new and
different ways, and locate gaps in the data collected.

Reading data in qualitative research generally begins with organizing data. Large amounts of data
need to be classified into smaller and manageable units, making them easier to retrieve and
review. Reading the data enables you to have a sense of the whole, by looking at themes,
patterns, and the unique, while not losing sight of the overall picture. It means immersing
yourself in the data. To stress the importance of closely reading or examining the data you have
collected, you can:
- Make as many labels or codes as needed
- From these, you can make a coding sheet, in which you collect the labels and, based on your
interpretation, cluster them into preliminary categories
- The next step is to order similar or dissimilar categories into broader higher order categories.
Each category is named using content-characteristic words
- Then, you use abstraction by formulating a general description of the phenomenon under
study: subcategories with similar events and information are grouped together as categories,
and categories are grouped as main categories
- During the reading process, you can also identify ‘missing analytical information’ and
continue data collection
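
To illustrate the clustering and abstraction described above, here is a minimal Python sketch; the labels and categories are invented for the example and are not drawn from any actual study:

```python
# Illustrative sketch: cluster code labels into preliminary categories,
# then group categories under a broader main category (abstraction).
# All labels are hypothetical.

coding_sheet = {
    "asking friends for advice": "peer support",
    "sharing posts in group chats": "peer support",
    "checking official pages": "information seeking",
    "comparing different news sources": "information seeking",
}

# Cluster labels into preliminary categories
categories = {}
for label, category in coding_sheet.items():
    categories.setdefault(category, []).append(label)

# Group categories under a higher-order, content-characteristic main category
main_categories = {"coping through mediated communication": sorted(categories)}

print(categories)
print(main_categories)
```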

14.1.2. The nature of data

Qualitative approaches show the realities behind “the numbers,” and provide rich descriptions
and interpretations of events, phenomena, people, communities, cultures, and rituals. This
implies that a text or an experience can have multiple and varying meanings, and that these
meanings cannot be judged as “empirically” true or false. A researcher can only provide one
possible interpretation among many. Other researchers, with different backgrounds or in
different contexts, can come to very different conclusions while using the same set of data. This
is why researchers themselves are considered the “tool” in qualitative research. This makes
qualitative analysis both challenging and rigorous, but also creative, original, and fun!

There are two important terms here: the emic and the etic approaches to data gathering. The
emic approach refers to perspectives obtained from within the community, culture or social
group being studied (or from the perspective of the subjects), while the etic approach refers to
perspectives from outside (or from the perspective of the observer).

In qualitative data analysis, the emic and etic approaches also provide useful starting points for
researchers because in qualitative data analysis, each researcher makes sense of the findings in a
personal way.

The emic perspective typically means approaching the data using the internal language and
meanings of a particular culture. While both the emic and etic perspectives are employed in
qualitative research, the emic approach is perceived by a number of scholars as being more
relevant in the interpretation of a culture and in the understanding of cultural experiences within
a particular group. The reason for this is that it is impossible to truly comprehend and appreciate
the nuances of a particular community or group of subjects unless one resides in, or is part of, that
culture.

In contrast to its counterpart, the etic perspective encompasses an external view on a culture,
language, meaning associations, and real-world events. Most often, in qualitative research, the
etic perspective is associated with that of the researcher since it comprises the structures and
criteria developed outside the culture as a framework for studying the culture. When researchers
take an etic approach to their study, they use preexisting theories, hypotheses, and perspectives
as constructs to see if they apply to an alternate setting or culture. The use of an etic perspective
or approach to research is beneficial as it enables comparisons to be made across multiple
cultures and populations, which can differ contextually. This comparison of differing cultures and
populations enables researchers to develop broader cross-cultural themes and concepts. An etic
(outsider's) perspective can never fully capture what it really means to be part of the culture.
Related to the concepts of emic and etic in qualitative data analysis are the inductive and
deductive approaches.

• Inductive Approach

The inductive approach is not based on a structured or predetermined framework. This is a
thorough and time-consuming approach to qualitative data analysis. This approach is often
used when the researchers know very little of the research phenomenon they are studying.
The researcher identifies important categories in the data, as well as patterns and
relationships, through what Schutt calls “a process of discovery” (2009, p. 358). There are
often no predefined measures or hypotheses. Ethnographic researchers call this an emic
focus, which means representing the setting in terms of the participants, rather than an etic
focus, in which the setting and its participants are represented in terms that the researcher
brings to the study.

• Deductive Approach

The deductive approach to qualitative data analysis involves analyzing data based on a
structure predetermined by the researcher. In this case, you can use your research questions
as a guide for grouping and analyzing your data. This is a quick and easy approach to
qualitative data analysis and can be used when you, as a researcher, have an idea of the
likely responses from your participants.

14.1.3. Source methods in qualitative data analysis

Qualitative data analysis is utilized for methods such as textual analysis, ethnography, and case
studies. We will focus our discussion on the analysis of data from textual analysis and
ethnography. But note that other methods embrace qualitative data analysis, such as grounded
theory, narrative inquiry, phenomenology, and even indigenous methods such as Sikolohiyang
Pilipino.

• Textual analysis

Generally, textual analysis is a way for researchers to gather information about how other
human beings make sense of the world, as well as understand the ways in which members of
various cultures and subcultures make sense of who they are. In communication and media
studies, we study texts (such as films, television programs, magazines, advertisements,
clothes, graffiti, and so on) in order to try and obtain a sense of the ways in which, in
particular cultures at particular times, people interpret reality. We also treat interview
transcripts, journals, recorded observations, or existing documents as texts. All these texts
combine to form meanings, and these meanings must be sorted and considered for
conclusions to be reached.

• Ethnography

In essence, ethnography is distinguished by its focus on the culture of a group or society,
through immersion in that culture, to study everyday lives. Data collection is mainly done
through participant observation, among other methods. In the field, the researcher takes
copious amounts of field notes. These notes often form the backbone in the analysis of
ethnographic data. Field notes can take many forms, such as detailed observations and
general interpretations, reflections, summaries of recorded interviews, even sights, scents,
and sounds. All these multiple data sources and data collection methods require
triangulation, which is a type of qualitative cross-checking or corroboration procedure. In
ethnography, all your data are expected to agree, or converge, to support a conclusion. If
the multiple sources of data are in agreement, the findings are believed to be more credible.
Triangulation greatly enhances the validity of qualitative findings.

• Case study

The case study provides an interesting counterpoint to textual analysis and ethnography,
because it can employ mixed methods to describe and understand a phenomenon. But if you
are looking to study communication or media phenomena, organizations, or processes in a
very holistic manner, then the case study should prove to be an appropriate method for you.
Researchers using the case study method inquire into their topics using the accounts of
people, the examination of documents, and the researcher’s personal
observations.

14.1.4. Key principles in qualitative data analysis

To research students, analyzing qualitative data can look confusing because it “looks” quite
unstructured. However, data analysis, in whatever form, can be carried out in an organized and
disciplined manner if students remain focused on their research problem and methodology.
Remember also that the validity of your research rests heavily on your data analysis.

Schutt’s Investigating the Social World (2009) offers valuable insights to students who want to
use qualitative techniques in data collection and analysis. The major points are:
- A focus on meanings, cultures, interpretations, specific situations, and behaviors
- The collection of many data on a few cases, rather than few data on many cases
- Study in depth and detail, without predetermined categories or directions, rather than
emphasis on analyses and categories determined in advance
- A conception of the researcher as an “instrument,” rather than as the designer of objective
instruments, to measure particular variables
- Sensitivity to context, rather than seeking universal generalizations
- Attention to the impact of the researchers’ and others’ values on the course of the analysis
rather than presuming the possibility of value-free inquiry, and
- Rich description of the world rather than the measurement of specific variables.

14.1.5. Key considerations in qualitative data analysis

The challenges of qualitative research analysis are many and varied. Researchers are responsible
for ensuring that their findings, and their interpretation of these findings, pass through a meticulous
and intensive process of analysis, so that the study will be able to completely answer its research
question. Can you trust the findings of a qualitative study? This is where the concepts of validity
and reliability come in.

Qualitative research cannot escape from subjectivity: this means that researchers cannot
separate themselves—their opinions, feelings, personal histories, identities, and contexts—from
the research. As Croucher and Cronn-Mills (2015) aptly put it, qualitative research proceeds from
the interpretive paradigm, which “focuses on the belief that reality is constructed through
subjective perceptions and interpretations of reality.” Thus, it is common practice for qualitative
researchers to actively participate in the research process, in the sense that the researchers
themselves are inherent, or can be observed, in the analysis and writing of the research.

• Trustworthiness

Although qualitative research places values above neutrality and holds that research
can never be value-free, it is not exempt from the rigors of research. People who
encounter your research should find it credible and believable, not questionable or
doubtful. Therefore, your research must be trustworthy. You can ensure that your study is
trustworthy through some strategies. Some of these are:
- Data triangulation—using multiple sources of data
- Prolonged engagement in the community or study setting
- Member checking—consulting with study participants on the accuracy and validity of
the data and the study findings, although for obvious reasons, this cannot be done in
artifact analysis.

There are some other ways to guarantee that your research analysis is thorough, meticulous,
and careful. These strengthen your confidence in your interpretations and conclusions.

• Auditing

The term “auditing” refers to the systematic review of the processes involved in the
researcher’s decisions or actions made in the course of the research. This is usually done
a) to ensure that the research conforms with accepted standards of quality in research or
b) to validate the accuracy of the results.

In qualitative research, auditing can be a valuable means to a) demonstrate the rigor of
research, b) answer questions regarding the researchers’ neutrality, and c) support the
credibility and trustworthiness of their findings and interpretations.

It is important that plans for an audit be addressed early in the design of a project so that
the process can be incorporated in the manner that is most appropriate to each study. If you
feel that your research needs an auditor, you should already indicate this in your study
proposal. In this case, researchers must address some concerns, such as: Who will serve as
the auditor? Which experts can potentially act as auditor? When should the auditing
process begin? How often should auditing take place? What aspects of the study should
undergo auditing?

The actual process of auditing can be initiated at any point in a study. It may also be
conducted near the conclusion of the study. Engaging auditors early in the process enables
them to provide valuable monitoring throughout various phases of the research. Whatever
involvement the auditor may have in the research project, the researchers must already
have planned for this in the early stages of the study. Auditing of a qualitative study involves
oversight and, at a minimum, review of the conduct and/or the conclusions of the study.

There are two general types of research audits:

a. Internal Audit—In an internal audit, the members of the research team themselves
provide a system of checks and balances for each other. This guarantees consistency in
the research process and can serve to identify, and subsequently decrease, the bias of
any specific team members involved in the research. An internal audit can involve an
exchange of documentation for review by other members of the team who can examine
decisions and analytic processes associated with the research. The internal auditor can
also be your thesis adviser.

b. External Audit—The external audit is a more formal and systematic process, in which
the researchers seek the assistance of a person or people who are not connected to, and
have no vested interest in, the research. An external auditor is usually a researcher
who is knowledgeable in the processes of qualitative research. The researchers then
present and defend all their decisions to the external auditor. The auditor can also
review raw data, notes, logs, journals, and other materials associated with the study.

• Authenticity and fairness

Authenticity in research requires students to reassure the academic community that the
study was conducted, and the data analyzed, in a genuine and credible manner. Authenticity
entails that any findings from the research have been examined, both from the
perspective of the participants’ lived experiences and in terms of their wider social,
political, and economic significance. What has been the impact of the research on the
communities and the people that have been studied? How will it be relevant to similar
communities and peoples? Authenticity, then, is seen as an important facet of establishing
trustworthiness in qualitative research, so that it may be of some benefit to the wider
society.

Fairness means that qualitative researchers need to ensure that participants have equal
access to the research inquiry. This is intended to avoid bias on the part of the researcher
and to enable the study’s participants to become part not only of the data gathering process
but also of the analysis and interpretation of data. In this way, no one is marginalized—all
participants’ voices, views, concerns, and perspectives are represented in the research
process.

• Confirmability

In qualitative research, the actions and perceptions of participants are analyzed for their
expressions of meaning within a given context. Consistent with the practices of the selected
qualitative methodology used, the researcher then interprets the participant expressions
through a coding or meaning-making process. In this coding process, the researcher is
looking for messages that are consistent with, confirm, or expand on current knowledge and
theory. From these insights, the researcher is then able to make statements about the
context under study.

In so doing, additional processes must be incorporated into the research design to verify
the truthfulness or meaning being asserted in the study. This is called confirmability.
Confirmability is often equated with reliability and objectivity in quantitative research.
Reliability and objectivity are measures of the accuracy of the truth or meaning being
expressed in the study. Confirming, or verifying, the findings of a research project is
important, so that the researchers can show that the study they have conducted matters
beyond one specific project. This is an essential part of any academic
endeavor: that the research moves beyond a one-time task and becomes part of an attempt
to build on, expand, or create theory.

Confirmability is an accurate means through which to verify the two basic goals of qualitative
research: a) to understand a phenomenon from the perspective of the research participants
and b) to understand the meanings people give to their experiences. Confirmability is
concerned with providing evidence that the researcher’s interpretations of participants’
constructions are rooted in the participants’ constructions, and also that data analysis and
the resulting findings and conclusions can be verified as reflective of, and grounded in, the
participants’ perceptions.

In essence, confirmability can be expressed as the degree to which the results of the study
are based on the research purpose and not altered due to researcher bias. Although
confirmability does not deny that each researcher will bring a unique perspective to the
study, it requires that the researcher account for any biases by being transparent and open
about them and use the appropriate qualitative methodological practices to respond to
those biases.

Trustworthiness, auditing, authenticity, fairness, and confirmability are key
requirements for rigor in the analysis of data. Although attending to them
is time-consuming, doing so enhances the study’s credibility and the researchers’ integrity.

14.2. The process of qualitative data analysis

Qualitative analysis is a non-linear process. It is no wonder, therefore, that textbooks on the
topic offer different illustrations of how it is carried out. The process described here showcases
the non-linearity of analysis in qualitative research. However non-linear the process may be,
almost all textbooks will have at least three major parts that comprise qualitative analysis,
namely: a) Data management or data reduction, b) Description, and c) Interpretation.

There may be variations in terms and scope in each of these three major components, but what is
common among qualitative research textbooks is the back-and-forth movement that typifies the
qualitative process.

In going through the following steps, do note that no single qualitative study realistically uses all
of them. Likewise, please be reminded that some steps do not come after each other, but rather
are simply components that supplement one another. This means that there are sections here
that can be skipped. They are simply presented to give you more tools to use as you see fit in your
specific qualitative research projects.

14.2.1. Data management through data reduction

Qualitative data reduction literally means the reduction of the information from qualitative data
records or data sets (such as interview transcripts, fieldnotes, textual coding, sheets of photos
and videos, data logs, journals, audio-visual materials, etc.) into more manageable information
that can be scanned faster and reviewed more easily so that it can be processed into
insights and interpretations. Since qualitative data are highly complex, the main goal of a
qualitative researcher is to distill basic observations, information, and notes into more
manageable descriptions and then work the data into interpretations. Almost all qualitative
researchers agree that data reduction is already a form of analysis.

The next six subsections under Data Management explain the sub-steps in applying Data
Management through data reduction. Not all of these sub-steps, though, need to be applied to all
qualitative analyses. They are shown here to give you options in managing your qualitative
data.

• Familiarization and organization of data sets

Data management through data reduction requires continuous decision-making that can
only be carried out from a thorough familiarization with the data. Some qualitative researchers call this
part of analysis pre-coding or data cleaning. Basically, the researcher takes stock of whether his/her
data are complete and comprehensive. One also typically conducts an inventory of the
variations of data records or data sets one has—from transcripts, FGD “clean notes,” or
photo logs, diaries, etc. This is an important preparatory stage in qualitative research and is
often called a prelude to analysis because of the subjectivity of the decision-making one is
expected to exercise.

This step may happen all throughout the data collection or data generation stage. As
researchers “clean” the data, they also gain a sense of how much more data generation is
needed. Do they still need to go back to the field? Is there a need to watch that video
clip again? Do they need to send that follow-up question via email since it was not
particularly expounded on in the last interview? It is important, however, that the analysis
stage be reflected in the research design timeline if the researchers intend to finish on time.
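
A minimal, illustrative sketch of such a stock-taking exercise follows; the record types and counts are hypothetical, not drawn from any actual study:

```python
# Illustrative sketch: inventory of data records during pre-coding or
# data cleaning. Record types and counts are hypothetical.

expected = {"interview transcripts": 8, "FGD clean notes": 2, "photo logs": 3}
collected = {"interview transcripts": 6, "FGD clean notes": 2, "photo logs": 3}

for record_type, target in expected.items():
    have = collected.get(record_type, 0)
    note = "complete" if have >= target else f"missing {target - have}; plan follow-up"
    print(f"{record_type}: {have}/{target} ({note})")
```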

• Data lay-outing as a pre-coding step

There are several ways to set up your data for analysis. Think of this process as preparing
your table for baking or your laboratory equipment for experimentation. It is, therefore,
important that researchers use a method that fits their disposition. Being systematic is
important in research, but for qualitative research, “systematic” does not have one look. Just
think of this step as assessing how your ingredients for a recipe should be arranged to
facilitate easy cooking. Since you are just starting out, it is helpful to follow these steps.

To emphasize, there are no strict rules on how the researcher should do this initial data lay-
out. What must be considered is the space for notetaking, coding, or commenting. A numeric
system or alphabetization may also be utilized depending on the researcher's level of comfort
with such a structure. In most cases, this initial lay-out can efficiently transition to the use of
qualitative computer software, as will be explained later. If software is not utilized, however,
this exercise of arranging data instills the discipline that qualitative analysis calls for.



• Identifying initial themes by making an index

Qualitative research textbooks have different stances with regard to the need for making an
initial index of themes (Denzin, 2002; Keyton, 2010; Saldaña, 2016). However, it is important
to lay out this basic step for those who would opt for a more structured qualitative analytical
procedure.

The initial index of themes is identified in order to establish the scope of the data set vis-à-
vis the research’s objectives. The index of themes is mainly guided by the research questions
and general impressions of data. This index may be constructed as the data generation
happens and can be polished at this point. Although the index may look final, it is provisional.

How is this done? The analyst may go back to some defined concepts in the concept-
indicators matrix and see which of these are “felt” during the data gathering period.
Indicators are the analyst’s educated guesses of what might be observed in the field, but
they are never final or absolute.

Another strategy for coming up with the initial thematic index is simply to make marginal
notes on your data records. An index is not something one comments on, but something one
consults every now and then.

Depending on the researcher’s preference, a thematic index may be rugged (a simple outline
of initial themes) or structured (a more detailed outline of initial themes).

• Initial tagging of concepts or Axial Coding

Axial Coding has two meanings in different textbooks on qualitative analysis (Richards, 2005;
Ritchie & Lewis, eds., 2003; Saldaña, 2016). The first is a type of coding using a priori (i.e.,
already existing or predetermined) terms that are usually generated from the study
framework's concepts and indicators. The second refers to finding the axis or intersections
among three to four codes. For example, the codes “managing student org activities,”
“coping with sem-ender acads,” and “relaxing with family” may share the axial code “students'
balancing acts.”

In this Primer, however, Axial Coding refers mostly to the first definition. This type of
initial coding can be compared to sorting toys into labeled boxes: the dinosaur toys go to
the box labeled “Dinosaurs,” the dolls go to “Dolls,” and so on. In this case, the
labels are numbers or sub-numbers in the thematic index. The act of labeling is done directly
on the data lay-out. If the data set is an interview transcript, it may look like the
example below. For consistency, the terms tagging, labeling, and coding are used
interchangeably in this resource material.

Saldaña (2016) terms this initial tagging In Vivo Coding or Verbatim Coding. This means
that categories are more indigenous to the actual utterances of informants or “texts.” The
initial categories in the index may sound more informal or “spoken” (colloquial, current,
or organic) rather than academic. If In Vivo Coding is utilized, the thematic indexing can
come after it.
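
To make this tagging more tangible for readers working with digitized transcripts, here is a
minimal sketch in Python. The index numbers, labels, and transcript excerpts are all hypothetical,
and the assignment of each tag remains the analyst's judgment call; the sketch only shows one
way tagged data might be organized.

# Hypothetical sketch of a priori (axial) tagging. The thematic index
# numbers, labels, and transcript excerpts below are invented for
# illustration; in practice the analyst assigns each tag by judgment.
from collections import defaultdict

thematic_index = {
    "1.1": "managing student org activities",
    "1.2": "coping with sem-ender acads",
    "1.3": "relaxing with family",
}

# Each excerpt is paired with the index number the analyst chose for it.
tagged_segments = [
    ("We had to reschedule the org's general assembly twice this month.", "1.1"),
    ("I crammed three papers in one weekend before the sem ended.", "1.2"),
    ("Sundays are for lunch with my parents, no laptops allowed.", "1.3"),
]

# Sort the excerpts into "labeled boxes," mirroring the toy-sorting analogy.
boxes = defaultdict(list)
for excerpt, code in tagged_segments:
    boxes[f"{code} {thematic_index[code]}"].append(excerpt)

for label, excerpts in boxes.items():
    print(label)
    for line in excerpts:
        print("  -", line)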



• Sorting the data through Cluster Coding or “Walling”

This analytical method is the opposite of Axial Coding in terms of coming up with initial
themes. This part is best conducted not as the logical next step but as a complementary step
to axial coding.

The procedure involves working closely with verbatim quotes (in the case of transcripts and
fieldnotes), photos, and other “moveable” texts. The researcher clusters these “texts” not
according to how they have been indexed in the initial thematic index. The analyst is tasked
to set aside the labels that were created beforehand and take on fresh ways of seeing.
Other textbooks term this part Intuitive Clustering, as it clusters “texts” according to how they
seem to belong together. The “naming” of each cluster comes after. Here, the act of
plastering them onto walls is most effective in group settings. Groupmates can look at each
piece of “text” repeatedly as the commenting and brainstorming happen.

The moving of post-its or pieces of paper is more tangible compared to working on
individual laptops. The thinking-out-loud practice among groupmates is also very effective in
facilitating analysis compared to working in their respective rooms or spaces. This method
values the literal act of coming together and “working with the wall” as an effective tool for
arriving at a group sentiment through think-out-loud discussions and even debates. Likewise,
“Walling” may also be used simultaneously with Axial Coding as a method of delegating work
among groupmates. For example, two groupmates do the Axial Coding while another
applies Walling. This way, the coding methods complement each other. This “wall” may
again be visited during the descriptive process and in the interpretive process.

The use of clustered post-its is a typical demonstration of “Walling.” Again, by
working with the wall, the researcher(s) can literally step back, pace around, and brainstorm
with other people (e.g., research groupmates). The tangibility of the material being
manipulated or moved (i.e., taking it off a cluster and pasting it onto another cluster) has
shown to be a good way to process qualitative data in a rather “felt” and “lived” way.

• Thematic charting

Thematic charting is a practical yet lengthy process of synthesizing or summarizing the data
and accounting for all of it. The chart also serves as one of the analytical charts the analyst
can consult for further evidence of cross-sectionality (i.e., that categories and sub-categories
were indeed observed in several cases). Although in summarized form, the thematic chart
retains the context and the essence of the informant's point. It should retain the voice
or the language of the informant (i.e., “text” or context). The general principle is to include
enough data and context without the analyst having to go back to the raw data set. However,
it should also not be so crowded as to render the chart undigested material.

Ideally, thematic charting should be numbered according to the original and expanded index.
At this point in the analysis, the chart serves as the analyst's “window” to the data sets.
There is not much interpretation done here, just a meticulous accounting of all that has
been collected. The main tone of a thematic chart is crisp, bulleted, and direct.

Before moving on, let us clarify how “description” may be utilized in qualitative research. The word
may be taken to mean two things:
a. Description as an output in the final write-up, constituting a major section, and
b. Description as an analytical procedure.
The latter substantiates the former, but they occur at different parts of the qualitative research.



14.2.2. Description as analysis: Analyzing for content

Description as the first cycle of analysis is the process of defining dimensions and elements,
refining categories, and further classifying data. It is mapping the range of diversity of each case
or phenomenon. Again, by describing something further, one is simply showing that there is
more to something than what meets the eye. The analyst is always sensing that there are more
layers of complexity in the data. By clustering data into labels and initial categories, the data are
made simple and at-a-glance manageable or “chewable.” By describing them further, the analyst
balances analysis between simplification and complication.

• Developing a Descriptive Chart

A descriptive chart, like the charts above, is one of the first tools used in this part of the analysis.
The chart organizes the nucleus of qualitative evidence, both as an analytical procedure and
as an aid in the writing part of the research. This portion sensitively reviews and captures
“extracted data” from the “wall” or the clustered data from the previous phase of analysis.
These “extracted data” then form what other textbooks term the “evidentiary warrant”
supporting interpretation.

At this point, the analyst can consult the “wall” used in Cluster Coding to review quotations
and important lines. Labels or codes in the thematic chart will be further scrutinized.
Some clusters of codes can now fall under one big classification or theme. The themes at this
point are now called descriptive themes, which are different from the initial themes
identified during the data management stage.

It is very common for 60 initial codes to fall under five to eight descriptive themes. However,
do not be tied down by these numbers; any number of descriptive themes will do at this
stage of analysis. It is better to describe several themes in written form than to describe many
codes. The descriptions in the written output, however, are expected to be heavy with
extracted data to fully account for the qualitative evidence (more on this in Writing and
Presenting Research).

Describing as analysis often results in the creation of typologies. However,
typologies are not required by all research designs, and not all categorizations are
typologies. Typologies are specific forms of classification that help describe and explain the
segmentation of the social world, or the way a phenomenon can be characterized or
differentiated through manageable “names.” Categorizations, on the other hand, may be long
descriptions that do not appear as names but rather as captions of categories.

Depending on which direction the analyst takes, descriptive themes can still be
chunked into one bigger descriptive theme, or they may be retained as they are. Most of the
time, the description part of an analysis is set out in the first objective or objectives of the
study. Therefore, the descriptive themes in the last column may already constitute the
main findings for these objectives.



At this point, the act of describing may still concentrate on actions or phenomena in
gerund form (i.e., a word derived from a verb but used as a noun) or ending in -ing.
Although descriptions may not always be in gerund or -ing form, it is important to note that one
can only describe something that is literally observable by the senses. The theme “chasing time,”
for instance, is much more “of this world” compared to the description “time flying fast.”
Therefore, even a non-human subject like “deadlines” can be made more tangible when the
gerund form of the verb is utilized. Later, in the interpretive part of analysis, these
descriptions can be translated into more abstract concepts.

• Other Types of First Cycle Coding Methods

This section briefly discusses some coding methods used in different qualitative
studies. Each coding method is identified with the type of data to which it is best
applied. The following are examples of first cycle coding methods: Structural Coding, Process
Coding, and Versus Coding.

- Structural Coding—This type of coding is the most utilitarian and the quickest way to
start off the analysis. The “structure” is the arrangement of questions or guide statements
in the qualitative instrument. The method involves putting all answers or observations under
each question or guide statement to ensure that what the study aims to answer has in fact been
answered.

- Process Coding—This type of coding is applicable to almost all kinds of qualitative
analysis. It focuses on how things move from point to point, as in the act of doing
something or arriving at something. This is also the most common default coding for
arriving at descriptive themes since the gerund or -ing form of the verb is used. For
instance, for the concept of “online ambiguity,” its “appearance” is best coded as the actual
things people do or say, or how websites, news reports, or blogs manifest it, rather than what
they actually have “inside.” What is inside is difficult to uncover, but this kind of coding trusts
that what is inside can be observed in the actions taken on the outside. Think of it as a label
we usually use for people, such as “kill joy,” then translate this into more observable patterns
of behavior like “avoiding spontaneous activities” or “not being able to see the humor in jokes.”

- Versus Coding—This kind of coding focuses on how two things are in direct conflict with
each other. Users of this code see the moiety (from the French moitié, “the other half”) of a
concept, phenomenon, status, etc. that is at the other end of a spectrum. This type of
coding is best used in policy studies in communication, gender studies, or even
discourse analysis. An example would be “Adapting versus Dodging.”

These are just some of the coding methods used in communication and media
studies. Across the literature, you may find many coding techniques that can suit your particular
type of study. Some of these are Affective Coding (on emotions), Values Coding (on values),
Dramaturgical Coding (on life as a performance), Verbal Exchange Coding (on verbal
exchanges), and many others.



• Microanalysis tools for description

Microanalysis is termed a tool in this Primer since it simply aids the analysis. It can
be applied in any kind of coding. Microanalysis means staying close to the data in the process of
describing or interpreting it further, going through the data closely, line by line. Please note,
though, that not all data require microanalysis; discretion is a must. There are simply some
lines, some pictures, or some scenes from a video clip, for instance, that cannot be described
immediately. Others may still “speak” to the researcher even after initial description and
therefore merit further microanalysis.

Most analysts would set aside a dedicated microanalysis period as the perfect transition to
interpretation. This is often done after certain passages or clips are tagged as either still
mysterious (i.e., not yet speaking to the analyst) or just plain talkative (i.e., still saying
something even after being described).

Please note that these tools for microanalysis may be used simultaneously, depending on
the researcher's main goal. Understand that not all of these tools may apply to every
qualitative analysis.

- Questioning—To question a heavily laden line (or lines) is to be captured by some of its
key words. These nodes of data might still appear to be something else aside from their
initial assessment. Often, these lines are multi-layered, idiomatic, or indirect, and thus
can have several meanings. Likewise, the questioning tool may reflect a certain
shortcoming in the data gathering procedure (i.e., there was no follow-up question) or
simply a lack in the instrumentation. Sometimes, the analyst questions the data because
this datum triggered the question while the other lines simply did not. What usually
happens is that the analyst reviews the thematic chart or the descriptive chart to
check if the question is somehow answered. The questioning may also confirm whether other
nodes of data demonstrate what is now being surfaced by the text. The questioning will
also inform whether this specific observation is unique or an isolated instance.

- Comparing—Since qualitative analysis is extensively about “naming” something as
important or present, the ability to distinguish one instance from another is an
important skill. It can be said that the whole of analysis is an act of comparing. Is
this action “Responsibility”? Or is it “Accountability”? Is what this informant is saying
about grief similar to what the other informant is saying about loss? Is this picture
showing me “anxious laughter” or “excited laughter”? Is tweet No. 45 similar
to tweet No. 34 in the way they label the government a “necessary demon,” or are
they saying two different things altogether?

- Flip-flopping—This tool advocates for the “other voice” or what is not being said. It can
be likened to defending somebody who is not present in the room. For example, an
initial description of “obliging monk” used for an obedient dormer can be flip-flopped:
“Are there dormers who are neither obliging monks nor sneaky?” How did the dorm
authority figures (e.g., the guards, the resident assistants, the Residence Hall Managers)
experience these two kinds of dormers? If they could label dormers, how would they
categorize them? How come the other informants did not see dormitory life this way?



- Red-tagging—This tool involves marking the tone of absolutism, purism, or fundamentalism in
the data. These qualify as tones because they are often conveyed both between the lines and
very directly. Often, they carry a voice of finality, as if the speaker is certain that this is
how everybody sees it. The red-tagging is done on the data to see if it warrants further
review in other parts of the data records, charts, or index. In qualitative research, these
fundamentalist or purist tones reveal much about the characteristics of people, events,
and phenomena.

14.2.3. Interpretation as analysis: Analyzing for context

“Context” in this regard is the set of circumstances to which individuals, phenomena,
organizations, groups, etc. in the material world or “inside texts” (in movie storylines, for
instance) respond by means of action, interaction, and emotion in relation to the rest of the
environment. “Context” ranges from macro to micro. When the researcher analyzes for context,
he or she looks for the story behind the direct story narrated by the individual or, in the case of
other forms of texts, the immediate backdrop of cultural artifacts.

Some qualitative textbooks refer to this part as the “explanatory account” or “extrinsic
explanations” as it answers the “hows” and “whys” of the study. This is also the part that answers
the question “What is causing this phenomenon to occur?”, not through statistical computations
of causality but through explanatory accounts. Answers to such questions can be found in the
informants' accounts themselves, in the latent and manifest meanings of “texts,” or in the
analyst's careful interpretation of the data as guided by the study's framework. These
explanations are carefully guided by the reviewed literature as well. Depending on the angle of
the study or the direction the analyst wants to take, interpretations can be a) dispositional
(derived from the behavior and intentions of individuals) or b) situational (derived from the
larger context or structure).

Qualitative analysis is a bridge between the data, which can be likened to an alien species trying
to communicate with earthlings, and the readers of the research project. The role of the analyst is
not only to goad this alien to speak, and thereby at least describe to the reader what it is trying
to communicate, but also to read between the lines of what this alien is saying. This expected
role obviously cannot be fulfilled in one sitting or overnight. A good deal of focus on the data is
needed for this “reading between the lines” to be revelatory, first to the analyst and then
eventually to the readers.

Interpretation reflects the analyst's subjectivity, positionality, and context. The term reflexivity,
an important tenet in qualitative research, in fact refers to the disclosed aspects of where the
researcher's subjectivity is rooted. Saldaña (2016), however, expands the term “reflection” to
“refraction” to mean that, like eyes whose corneas have different levels of thickness on the
surface, one's reading is a refraction of sorts, considering the convex and the concave in one's
customized analytical lenses.



Having mentioned all these, you may now feel intimidated by the act of interpretation as you go
into the second cycle of analysis. Below are some tools to guide you in the actual conduct of
interpreting. To make this concept more tangible and “of this world,” think of it as “chewing the
data” further to extract their essence. The description part was similar to describing food as
salty, sweet, or crunchy. In the interpretation part, on the other hand, the chewing is hoped to
bring out more nuanced characteristics of the food: “rancid, sharp, acidic, with a hint of the
Mediterranean and the Italian.” Food can even be viewed as a mirror of cultures, family
histories, and markers of milestones. In interpretation, therefore, the unit of analysis is seen
more from a meso or macro perspective (i.e., the bigger picture).

• Level 1 interpretation using Diagramming

Qualitative analysis is not only about enumerating findings in the form of themes, categories,
or typologies. One is expected to enumerate findings to simplify thick descriptions, but this is
not all there is to analysis. Equally important to coding as the basic tool for description and
interpretation is diagramming or data displaying. This tool is utilized so that analysis
becomes not just an exercise of categorization but also of linking concepts. By linking
“smaller concepts” to “bigger concepts,” the qualitative data can be visualized in a less linear
manner. With a diagram, the interplay among categories and subcategories can be
established. Therefore, although the written form can appear flat, the written text in
the research output can offer a more complex analysis of how things are placed within the
larger collective world of the data or findings.

• Level 2 interpretation: Linking to literature and theory

The most popular tool for either starting out in interpretation or deepening it is Memoing.
Since this part of the analysis requires the researcher to link initial findings to literature
and theory, it helps to discipline oneself in “writing out loud” to give form to musings that
may easily be forgotten. Memoing is best done by crafting complete paragraphs that
expound on findings. It is also utilized to discipline researchers to produce inferences in
written form. One infers from the described data by churning up one's interpretations of them
and relating the findings to the literature and the study's framework. Aside from Memoing,
other tools are the Trinity Test (checking how the findings reflect the micro, meso, and
macro realities implicated by the “text”) and the Touch Test (translating the tangible
descriptions of texts, the ones that can be “touched,” into abstract forms). Notice how the
Trinity and Touch tests can also be facilitated by Memoing.

• Level 3 interpretation: Drawing Conclusions, Implications and Recommendations

To draw conclusions is to verify if the research question has been answered through the
results and discussion corresponding to each objective. In qualitative research, to conclude is
to tighten the soundness of the theoretical claims gleaned from the analysis. Were the
original theoretical claims substantiated further in this particular study, were they
challenged, or were they expanded? To make sure that each research objective has been
achieved, a conclusion is made for each major section of the Results and Discussion part.

Meanwhile, to elucidate the researcher's implications and recommendations means to infer
in three areas: a) theoretical, b) methodological, and c) practical. Again, these inferences
are grounded on the in-depth analysis and second-level interpretations done preceding this
part.



14.3. Qualitative research writing

According to Kathy Charmaz (2006), the written outputs of qualitative research “present the
form and content of the analytic work” (p. 151). As analysis in qualitative research happens all
throughout the research stages, writing the research is essentially a process of combining and
organizing the written memos to create a strong argument in support of the overall
interpretation or theoretical proposition regarding the particular communication phenomenon
under study. Thus, writing qualitative research starts early in the research process. It helps to
sort out your memos well because, toward the actual writing of the research report, you will
realize that some memos are suitable for framing the introduction, some are useful in
theorizing and supporting your arguments, and some provide summaries that may be good for
the conclusion. But piecing them all together and presenting them in written form certainly
requires strategies in effective writing: from the choice of words and the logic of the writing
structure to something as simple as using good transitions between sections.

It is helpful to mention that there are different forms of written qualitative communication
research. You may be asked to submit one as an essay for a class requirement, a journal article,
or a full-blown thesis. Thus, there is no one-format-fits-all formula in writing qualitative
research.

While the structure is mostly the same as in any research output—it still essentially must contain
the introduction, the review of related literature, the theoretical framework, and the
methodology—there are some unique characteristics that differentiate it from quantitative
research writing. For example, while the writing of the review of related literature section
generally aims to demonstrate your grasp of relevant works, and to show the connection with
and refutation of extant knowledge, the writing of the theoretical framework is different for
qualitative research.

Offhand, a section on the theoretical framework may seem counterproductive because we
understand qualitative research to be inductive (please see the discussion on inductive and
deductive research in the earlier part of this section), and there seems to be a misunderstanding
that writing a theoretical framework section automatically means a deductive approach to
research. However, every written qualitative research requires a theoretical framework or
engages in some theoretical discussion (as in grounded theory or phenomenology), as qualitative
research still aims to show extant concepts and theories and how the present study builds on
them: to strengthen, clarify, contextualize, refute, or expand the theoretical framework used for
the study. While quantitative and qualitative research writing do share many similarities, for the
purpose of this chapter, the succeeding sections will focus on the other unique features of, as
well as practical tips on, qualitative research writing.

14.3.1. First-person perspective

Qualitative research writing is often perceived to be subjective because it typically uses the first-
person perspective (i.e., the pronoun “I”). This is typical in qualitative research writing (which is
not to say that qualitative researchers do not utilize the third-person perspective) because
most researchers apply an experiential tone to emphasize the authors' involvement in the data-
gathering and analysis processes, and to show that the authors are themselves evidence in this
process of persuading an audience of their theoretical proposition.



Qualitative research writing goes beyond the reporting style typical of positivist research, which
strategically “distances” the researcher from the research itself. For qualitative researchers,
using the first-person, experiential tone humanizes the written output. Writing the research as it
was “lived” by the authors not only shows the researchers' involvement but also makes the
human communication phenomenon being studied more accessible and intimate to the
readers.

14.3.2. Positionality

In qualitative research writing, the author's “positionality” has to be declared. For example, if
you are to write your research about the communicative behaviors of students in the University
of the Philippines, you have to identify yourself as a UP student, an “insider” to the other
students you would interview for the study and to the culture or phenomenon being studied.
The awareness of this connection, which is not always beneficial and may even
prove to be a hindrance (e.g., other students might not treat the student-researcher seriously
because they are of the same cohort), is captured in this reflection: “[t]o the extent that there
were similarities and unspoken understandings between us, my position… was shaped not by an
‘ineffability of difference’ (Visweswaran, 1994) but by the shifting, often overlapping, and
sometimes contradictory registers of our identities” (Mankekar, 1999, p. 34).

By being aware of and declaring our positionality, we are able to grapple with the challenges
of qualitative research as inextricably subjective. This awareness (that we are, for example, a UP
student but also a researcher at the same time) enables us to practice trustworthiness, auditing,
authenticity, fairness, and confirmability more effectively and ensures the credibility of both
ourselves as researchers and of our research outputs.

14.3.3. “Thick description” and metaphors

Writing qualitative research involves the use of thick descriptions (Geertz, 1973). This means
obtaining rich data through extensive writing of fieldnotes, observations, personal accounts, and
detailed narratives (Charmaz, 2006).

The use of figurative language may be seen as too informal for positivist and quantitative
research, where concepts are clearly defined using simple and straightforward words for easier
measurement. However, metaphors and analogies are inevitable, if not essential, in qualitative
research writing, especially those expressed by research participants, as these reveal “tacit
meanings” and are thus considered data in themselves. Therefore, while qualitative research
writing uses metaphors and analogies liberally, this does not make qualitative research any less
scientific. Moreover, metaphors and analogies add layers to the “thick description” that is
supposed to “unpack” the communication phenomena being studied.

14.3.4. Writing as drafts

As in analysis and interpretation, writing is always a work in progress all throughout the stages of
qualitative research. In creating drafts of the research, in writing and rewriting, more discoveries
are made, and these discoveries are organized more clearly for stronger argumentation and
persuasion. In essence, in qualitative research, analysis, interpretation, and writing are
integrated: we write memos and drafts of the report to aid clearer and stronger analysis and, in
so doing, we start to create the written output that presents our analysis and interpretation. This
is different from how quantitative researchers write their papers, which requires that all data
be collected first, then analyzed using statistical tools, and only afterwards interpreted and
written up or reported.



Writing as drafts helps build a more sound and meaningful qualitative research that aims for
theorization. While every piece of qualitative writing involves writing as reporting, as thick
description, it should not end in mere description. In the end, we want our readers to be
persuaded about the theory we are proposing or to accept our interpretation of a communication
phenomenon. In the process of writing or creating your drafts, ask yourself the following
questions:
- What is it that you propose? Your research must propose an assumption, an interpretation,
or theoretical argument about a particular communication phenomenon. This must be
explicit in your written output (e.g., essay, journal article, thesis).
- What are the conceptual pegs or categories of this theoretical proposition? This
conceptualization and categorization are an output of combining and piecing together your
memos.
- What are the supporting details for these? The individual observations, accounts, and
narratives in your memos will provide the “thick description” that gives empirical basis to
your conceptualizations.

It may be cumbersome to organize the “thick descriptions” and elevate these into theories,
especially as the data in your research may be voluminous, but writing in drafts will further
streamline this process of organization. A word of caution: you will inevitably encounter
interesting and colorful stories from your research participants that may entice you to include
everything in your writing. However, remember to choose only those that support your analysis
and main theoretical argument. The three-question structure above may help you stay
focused.

14.3.5. So what?

In the end, “So what?”

You may encounter this question from many of your research teachers or even from non-academic
readers and practitioners in communication whenever they hear you talk about your
research ideas. As in quantitative research, you will always be asked about the relevance of your
qualitative study. This is also why qualitative research writing that stays on a “descriptive” level
fails to capture its readers; your research audience must be able to see “what's in it for them.”
Elevating the understanding of communication phenomena to a theoretical level makes
knowledge significant and useful to as many people as possible, not only to those being described
by certain quotes or descriptions used in the written research report. Effective qualitative
research writing must articulate this relevance as early as the Introduction and reiterate it at the
end of the paper, after all the thick descriptions, categories, and concepts are built in between.

14.3.6. Other writing tips

It is helpful to use conceptual categories as headings in the body of the manuscript. However, be
careful not to use these too liberally, as too many headings only make your writing convoluted or
even just “too descriptive.” Use only the powerful ones, meaning those that provide stronger
conceptual pegs or signposts for your theoretical proposition. On the other hand, also be mindful
that some subcategories that seem to support these categories ought to be headings on their
own. Be judicious about this.



There are many approaches to this: you may want to base your conceptual categories and
headings on some chronology or progression in the communication phenomenon being
studied, on emerging typologies, or on some hierarchy among the concepts created, from
superordinate to subordinate categories. Usually, categories and headings in qualitative
research writing are written as gerunds, or verb forms that show action. In the end, remember
to be kind to yourself and know that, as qualitative research and its writing are a work in
progress, writing and creating the drafts of your final written output is a process of filtering your
conceptual categories or section headings until you arrive at the draft where the theoretical
proposition is written and presented in the most compelling form.

Part of writing the drafts is to write in stages of persuasion. In writing to persuade people to
accept a theoretical proposal, persuaders themselves must be convinced about it first. Only after
being personally convinced may one add another layer of persuasion: rewriting one's manuscript
depending on the audience, whether other students, researchers, professionals, or even a
teacher for a class requirement. In this rewriting part, think of the “so what?” and “what's in it
for them?” as discussed earlier.

Finally, even as we consistently describe the processes and writing of qualitative research in
terms of how these differ from quantitative research, this does not necessarily mean that
qualitative research writing totally excludes quantitative data reporting or presentation.

Tashakkori and Teddlie (2003) explain that in the mixed methods approach, the integration of
quantitative and qualitative approaches occurs in three stages of the research design: the
conceptualization stage, the experiential stage (methodological/analytical), and the inferential
stage. Combining the two approaches allows the researcher to obtain and analyze
complementary sets of data, capturing both the depth and breadth of the communication
phenomenon being studied. Thus, qualitative research writing may still present matrices and
tables (such as summaries of profiles or mean scores of the perceptions or attitudes of research
participants), but qualitative research certainly highlights and focuses more on the “stories
behind the numbers.”

A note on translations

As qualitative data analysis involves the management of stories, narratives, anecdotes,
utterances, etc., it is inevitable that we deal with transcriptions and translations in the conduct
of our research. To encourage participation and responses from our local interviewees, it is wise
to conduct our interviews in the local language, the transcripts of which will have to be
translated especially when we are writing our research for an English-reading audience.

Indeed, there are practical, methodological concerns involved in doing translations for research
writing. These include a) the availability of a translator who speaks both the local language and
English and b) the cost and time needed for translation. However, doing translations also
involves fundamental epistemological issues. For instance, are meanings lost in the process of
translation? In instances when translators are externally commissioned by the researcher, are
contexts missed out in the course of translation? Finally, should translators be involved in the
data analysis? If so, what should be their level of participation?



One way to address this issue is to be transparent about the issue of bias or, in this case, “the loss
in translation.” Just like declaring one's positionality in research, it helps if you let the readers
understand how you managed the process of translation. Provide answers to the following:
- What language was the data collected in?
- Were the data transcribed and then translated?
- Who did the transcription and translation? When were these conducted?
- What issues surfaced during the process of translation? How did you deal with these issues?

In the end, your readers knowing that you are aware of and managed translation issues in your
study is better than them thinking that you are feigning ignorance of these issues and keeping
them in the dark.

14.4. Computer software for qualitative data analysis

14.4.1. What computers can do in aid of analysis

Computer Assisted/Aided Qualitative Data Analysis Software or CAQDAS, as the long name says,
simply means the application or use of software packages for qualitative data analysis. These
software packages offer tools to assist the users in carrying out the essential processes in
qualitative research, as discussed above: from recording and organizing data, creating memos,
assigning codes, to sorting codes into higher levels of analysis such as categories and themes and
other analytic styles, as well as creating visual presentations of these data.

Since the flourishing of a wide range of software programs especially in the 1980s, each with
unique functions and features, scholars and research professionals alike have increasingly utilized
CAQDAS as a welcome support to the highly complex tasks in qualitative data analysis.
After qualitative data have been digitized as a Word or PDF document, an Excel or SPSS table, or
a set of image, audio or video files, CAQDAS allows researchers to upload and store these data
for use in a particular research project.

Even prior to storing, CAQDAS also allows the recording of data directly, as qualitative research
nowadays involves not only digitized transcripts of interviews or photos, audio, and video
captured during participant observations, but also social media data like blogs, tweets,
YouTube comments, or Facebook posts, which may be retrieved directly and in real time (web
scraping) from the Internet. In short, CAQDAS assists researchers in recording, storing, querying
or searching, and retrieving information for further processing and eventual analysis. Thus,
CAQDAS is seen as beneficial for researchers as it allows unobtrusive and objective management
of data. Although a code or memo may be added to a document, for instance, the original texts
or documents are maintained separately.

Moreover, it is more efficient than non-computer-assisted analysis because it allows the faster
organization of big volumes of data with fewer resources: the researcher just needs to “train” the
computer program, and the program will take over the manual and clerical tasks expected of the
researcher. Researchers also benefit from the flexibility of CAQDAS, as the digitally organized
data easily allow the adding and appending of newer data, especially as data generation in
qualitative research happens all throughout the research process.



CAQDAS also allows the processing of raw qualitative data for “cleaner” analysis. It allows
functions such as the removal of punctuation and stop words (words that carry little meaning,
such as the articles “the” and “a” and the conjunction “and”). It also automates the assignment
of codes and assists in classifying and clustering these codes (for example, by training the
machine to classify or cluster according to similar or discriminant words) to generate categories
or themes or to produce higher-level abstractions or interpretations. CAQDAS also allows text to
be coded as numbers for supplementary quantitative analysis, from something as simple as
computing frequencies of key words to generating regression models from unstructured text.
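
As a rough illustration of the clerical work these packages automate, here is a minimal Python
sketch that removes punctuation, drops stop words, and counts keyword frequencies. The
excerpts and the stop word list are hypothetical, and dedicated CAQDAS packages perform these
tasks (and much more) through their own interfaces.

# Minimal sketch of the clerical work CAQDAS automates: stripping
# punctuation, dropping stop words, and counting keyword frequencies.
# The excerpts and the stop word list are hypothetical.
import string
from collections import Counter

excerpts = [
    "The deadline was moved, and the team was relieved.",
    "Deadlines keep moving, so the team keeps adjusting.",
]

stop_words = {"the", "and", "was", "so", "a", "an", "of"}

tokens = []
for text in excerpts:
    # Remove punctuation and normalize case before tokenizing
    cleaned = text.translate(str.maketrans("", "", string.punctuation)).lower()
    tokens.extend(word for word in cleaned.split() if word not in stop_words)

# Frequency count of the remaining keywords
print(Counter(tokens).most_common(5))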

Computer software also provides effective tools for data visualization: plotting, diagramming,
mapping, etc., which may aid analysis (“walling”) as well as the presentation of analyzed
qualitative data. Through data visualization, CAQDAS helps qualitative researchers become better
storytellers.

14.4.2. What they cannot do

While it is now clear how computer software can aid qualitative data analysis, it is not
without limitations and challenges in dealing with and interpreting qualitative data.
Researchers caution that the very mechanistic and systematic processes of computer software
lead to a reifying, reductionist, and maybe even deterministic manner of processing qualitative
data, which rids the entire process of the essence of context and of the explanatory accounts of
the participants in the research. Managing qualitative data as volumes of texts may result in a
“quantification” mindset instead of digging into the depth of the narratives and their meanings.

The use of CAQDAS in research and knowledge creation is also seen as a commercialization of
knowledge, as the use of CAQDAS certainly depends on technology availability, affordances,
access, and know-how. In the Philippines, there is still a stark first- and second-level divide
between researchers in the academe and in the private sector. As most of these programs are
not free, mostly businesses and market research companies are able to obtain licenses for these
programs, as well as the needed training and workshops on the use of these computer packages.
As for researchers who can obtain free software or trial versions, a lot of time and energy may be
diverted to familiarization with the software rather than to the actual data analysis work.



14.4.3. Which software do I use?

There are many computer-assisted qualitative data analysis software packages currently available
to researchers. Because each of these packages boasts varying tools, features, and even
interfaces, it is recommended to study and perhaps do a test run of each program to determine
the package best suited to your own research. For research students working on a tight budget,
it would help to know which ones are freeware (e.g., Transana), which have free limited
editions (e.g., HyperRESEARCH), and which have trial versions (typically of paid software) that
may be accessed for limited periods.

Another factor to consider when choosing which software works best for you is the
methodological approach of your research. Although all of these packages may be used for, say,
textual analysis, some are more suitable for particular qualitative research approaches. For
example, NVivo and ATLAS.ti are seen to be the most popular among those using Grounded
Theory; DICTION, HyperRESEARCH, and Transana, because of their better functionality with
audio and video data, are deemed useful for Conversation Analysis; and Ethnograph is employed
in ethnographic studies.



15. QUANTITATIVE DATA ANALYSIS AND INTERPRETATION
by Associate Professor Ma. Rosel S. San Pascual, PhD & Assistant Professor Jon Benedik A.
Bunquin, MA

15.1. Overview

15.1.1. Purpose of quantitative data analysis

Quantitative analysis helps you make sense of gathered observations by a) describing the sample
from which the observations have been generated and b) assessing whether you can use your
descriptions of the sample to make inferences about the population from which the sample was
derived. In other words, you may conduct quantitative analysis for a) descriptive purposes, to
describe the sample by counting the recurrence of observations gathered from the sample, or b)
inferential purposes, to make conclusions about the population using descriptions of the sample.

15.1.2. Nature and sources of data for quantitative data analysis

Consistent with the positivist paradigm, which is ontologically predisposed to perceive reality
objectively and is epistemologically designed to generate objective observations, you are then
tasked to objectively analyze these observations through quantitative data analysis tools. This
enhances the replicability of data analysis and the reliability of research results.

Data for quantitative analysis are derived from research methods under the positivist paradigm:
content analysis, experiment, and survey. Since quantitative analysis processes numerical data,
to subject your observations to quantitative analysis, values must be numeric. If values are non-
numeric, then these values must first be transformed into numeric codes.

The positivist paradigm generates values from objective observations of your study's variables
that may either be inherently numeric (e.g., average monthly family income) or categorical (e.g.,
gender, which is conventionally categorized into either female or male). Thus, while inherently
numeric values are already primed for quantitative analysis, you must first transform categorical
values into numeric codes (e.g., the conventional categories of gender may be numerically coded
as “1” for female and “2” for male) before conducting quantitative analysis.

Inherently numeric values are derived from variables that are measured at the interval or ratio
level while categorical values are derived from variables that are measured at the nominal or
ordinal level. Table 1 presents sample variables, the range of observations of these sample
variables, their level of measurement, and the numeric processing involved to prepare them for
quantitative analysis:



Table 1. Example of variables, range of observations, level of measurement, and numeric processing

- Gender (Nominal). Range of observations: Female, Male. Numeric processing: Values are
categorical and must be transformed into numeric codes: assign “1” for Female and “2” for Male.

- Latin honor standing (Ordinal). Range of observations: Summa cum laude, Magna cum laude,
Cum laude, None. Numeric processing: Values are categorical and must be transformed into
numeric codes which should also reflect the values' increasing order in the array: assign “0” for
None, “1” for Cum laude, “2” for Magna cum laude, and “3” for Summa cum laude.

- UP General Weighted Average (Interval). Range of observations: From 1.0 (highest possible
grade) to 5.0 (lowest possible grade). Numeric processing: Values are numeric and may be
analyzed as is.

- Number of affiliations in student organizations (Ratio). Range of observations: At least 0 (no
affiliation). Numeric processing: Values are numeric and may be analyzed as is.
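
As a simple illustration of this preparatory step, the sketch below uses Python's pandas library to
recode the categorical variables from Table 1 into numeric codes. The small set of respondents is
invented purely for demonstration.

# Minimal sketch of recoding categorical values into numeric codes before
# quantitative analysis, mirroring Table 1. The three respondents are invented.
import pandas as pd

df = pd.DataFrame({
    "gender": ["Female", "Male", "Female"],
    "latin_honor": ["None", "Cum laude", "Magna cum laude"],
    "gwa": [2.25, 1.75, 1.50],        # interval level: analyzed as is
    "org_affiliations": [2, 0, 3],    # ratio level: analyzed as is
})

# Nominal variable: arbitrary numeric codes
df["gender_code"] = df["gender"].map({"Female": 1, "Male": 2})

# Ordinal variable: codes must follow the increasing order of the categories
honor_order = {"None": 0, "Cum laude": 1, "Magna cum laude": 2, "Summa cum laude": 3}
df["latin_honor_code"] = df["latin_honor"].map(honor_order)

print(df[["gender_code", "latin_honor_code", "gwa", "org_affiliations"]])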

Using Secondary Data


Some research studies make use of secondary data, or data which have been previously collected by other researchers for
a different study, as opposed to primary data, or data collected first-hand. Secondary data could be datasets that are
publicly available online or existing datasets from other institutional research projects made available to students or
institutional researchers. Secondary data save researchers resources as they eliminate the need to conduct data gathering.
Moreover, secondary datasets usually come from large sample sizes and are typically part of longitudinal research
projects. Thus, there are many possible research projects that could be created from these types of datasets. However,
since the researchers did not conceptualize the project from which the secondary data came, they must familiarize
themselves with the dataset: how it was collected, who the population was, what the objectives of the study were, and
what research instrument was used. Moreover, researchers may not find all the data their project requires in one
secondary dataset.

15.1.3. Basic guiding principles

Because there is a vast array of quantitative data analysis tools, it can get truly overwhelming
and intimidating when you are faced with the menu of statistical tests. This section presents
three basic guiding principles that are going to help you choose the most appropriate statistical
test, which depend on whether a) you are analyzing data derived from a representative or non-
representative sample, b) you are aiming to empirically test for association or to empirically test
for comparison, and c) you are testing variable/s at the nominal, ordinal, or interval/ratio level.

• Data derived from either representative or non-representative sample

Quantitative analysis predominantly deals with observations derived from a sample.
Meanwhile, quantitative data analysis tools are typically classified in terms of whether the
sample a) would permit making inferences about the population or b) is simply limited to a
description of the sample.



A sample that allows us to use inferential statistics to inquire about the population is a
representative sample, that is, a sample that is adequately sized and randomly drawn. The
Survey section earlier presents how a representative sample may be generated. However, if
the sample is not representative of the population, it is not suitable for inferential testing,
and the non-representative sample only allows us to use descriptive statistics to
generate information about the sample at hand. Different sets of statistical tools are
classified under descriptive and inferential statistics.

So, ask yourself: are you working with data generated from a representative or a non-
representative sample? Continue reading to answer this question.

• Testing for association or for comparison

Quantitative analysis supports the positivist paradigm of empirically validating what we
theoretically know about an objective reality. What we theoretically know is typically
articulated in the form of research hypotheses, which are statements that postulate some
association between variables or posit some variable comparison between or across groups.

One set of quantitative data analysis tools is designed to test the empirical validity of
postulated associations, and another set is designed to test the empirical validity of posited
group comparisons.

• Testing variable/s at the nominal, ordinal, or interval/ratio level.

The choice of a specific statistical test for association or comparison depends on a) the level
of measurement of the variables you are pairing in the case of testing for association and b)
the level of measurement of the dependent (comparison) variable in the case of testing for
comparison. The corresponding level of measurement for a variable has implications on the
mathematical operation that can be performed when analyzing the data for that variable.

a. Nominal level—A variable is classified as nominal when it is translated into a measure
composed of categories that cannot be arrayed. Data gathered for a nominal level
variable are classified according to the categorical measure of that variable, after which
the number of cases classified per category is counted.

b. Ordinal level—A variable is classified as ordinal when it is translated into a measure
composed of categories that can be arrayed. Apart from counting classified cases per
category, the categories of ordinal level variables may also be arrayed and ranked.

c. Interval level—A variable is classified as interval when it is translated into a measure
with an equidistant numeric set of values. Data gathered for an interval level variable may
be subjected to addition, subtraction, multiplication, and division.

d. Ratio level—A variable is classified as ratio when it is translated into a numeric measure
with an equidistant numeric set of values and where the “zero” value means that the
characteristic being measured does not exist. Data gathered for ratio level variables may
be subjected to the same operations as interval level variables: addition, subtraction,
multiplication, and division. Thus, variables measured at the interval or ratio level are at
times referred to as scale level variables.



So, depending on whether you are testing for association, for comparison, or both, ask yourself:
- If you are going to test for association, what variables are you going to pair and what is/are
the level/s of measurement of these variables?
- If you are going to test for comparison, what is the level of measurement of your
dependent (comparison) variable?

Table 2. Levels of measurement and their corresponding mathematical operations

- Nominal. Sample variable: Gender (Female, Male). Mathematical operations: categorical
classification of cases and counting the number of cases per category. Sample operation:
Female = 250 respondents (62.5%); Male = 150 respondents (37.5%); N = 400 (100%).

- Ordinal. Sample variable: Latin honor standing (Summa cum laude, Magna cum laude, Cum
laude, None). Mathematical operations: all operations applicable to the nominal level plus
ranking of categories. Sample operation, with categories in decreasing order: Summa cum laude
= 1 respondent (0.25%); Magna cum laude = 15 respondents (3.75%); Cum laude = 30
respondents (7.5%); None = 354 respondents (88.5%); N = 400 (100%).

- Interval. Sample variable: UP General Weighted Average, from 1.0 (highest possible grade) to
5.0 (lowest possible grade). Mathematical operations: all operations applicable to the ordinal
level plus addition, subtraction, multiplication, and division. Sample operation: average UP GWA
of respondents = 2.25; standard deviation = 0.5.

- Ratio. Sample variable: Number of affiliations in student organizations, at least 0 (no
affiliation). Mathematical operations: all operations applicable to the interval level. Sample
operation: average number of affiliations = 2 organizations; standard deviation = 1 organization.

15.2. Key concepts

15.2.1. Descriptive statistics

As the name implies, descriptive statistics is the branch of statistics that deals with the description
of the sample through univariate, bivariate, and multivariate analysis. This categorization relates
to the number of variables used to describe a sample at a time (a brief sketch after this list
illustrates the first two):

a. Univariate analysis describes the sample one variable at a time

b. Bivariate analysis describes the sample through analysis of the association between pairs of
variables

c. Multivariate analysis describes the sample through analysis of the association of a set of
variables



Using PSPP for statistical analysis
Statistical analysis programs can cost a lot. Thankfully, we have a number of open source (free) programs available to
everyone online. One of these is PSPP. Its name is a play on SPSS, one of the most popular statistical analysis programs,
which is developed by IBM.

PSPP appears similar to SPSS but, unlike the latter, PSPP is completely free. Developed under the GNU Project, PSPP allows its users to perform
common statistical analysis techniques, such as descriptive statistics, t-tests, ANOVA, and linear and logistic regressions, among
others. It can perform these operations on large numbers of cases, and it is designed to run these statistical operations
much like SPSS does.

PSPP also has a point-and-click interface. Users navigate their way through windows and perform commands by clicking
through various menus and options found in its interface. Meanwhile, users who prefer entering commands through code
will be able to do so as well, since PSPP also has a syntax window.

Throughout this section, we’ll be learning some of the basic statistical operations that you can perform in PSPP. Thus, if
you have not installed PSPP yet in your computers, visit this link for the download details:
https://www.gnu.org/software/pspp/get.html

PSPP is comprised of three windows, namely:

- Data Editor—This window contains your data and variables. The DATA VIEW displays your actual data. Cases are laid
out per rows, and Variables are laid out per column. The VARIABLE VIEW shows the list of each variable in the
dataset, including its type, label, value labels/categories, missing values, measure, and role.

- Syntax Editor—This window contains the terminal in which you could input PSPP commands through code. In the
Data Editor, click FILE > NEW > SYNTAX to open the syntax editor.

- Output Viewer—This window shows all actions performed in PSPP. It will automatically open once you perform
anything, such as opening a file, performing an analysis, or running a syntax command. The output window also displays
the results of your analysis.

Why use syntax?

While most PSPP users do not really use the syntax editor, this window is a powerful tool for using the program more
efficiently. Coding allows users to document their operations. Saving the code, meanwhile, allows the syntax to be reused
by the coder, thereby eliminating time spent looking for buttons in the menu. In many instances, typing saves time
compared with the point-and-click method, as shown in the sample command below.
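For instance, the frequency distribution of a variable can be produced with one short command in the syntax editor. Here is a minimal sketch, assuming a variable named v4 holds the respondents' sex (the variable name follows the dataset used in the examples later in this section); running it is equivalent to clicking ANALYZE > DESCRIPTIVE STATISTICS > FREQUENCIES:

* Frequency distribution of sex (v4). Lines starting with an asterisk are comments.
FREQUENCIES /VARIABLES=v4.

Every command ends with a period, and a saved syntax file can be re-run on an updated datafile without repeating any clicks.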

Popular tools for univariate descriptive analysis

Univariate analysis covers a range of statistical tests, such as data distribution, measures of
central tendency, and measures of variability. Data distribution describes the sample one
variable at a time using a set of numbers, typically in the form of frequency count and percent
distribution and displayed through tables, graphs, or charts. Meanwhile, measures of central
tendency and measures of variability use a summarizing technique to describe the sample, one
variable at a time, using a single number that most aptly describes the variable under
consideration. Check out the guide on how to use PSPP for univariate descriptive analysis.



• Measures of Central Tendency

Measures of central tendency include mode, median, and mean. The choice of which
measure of central tendency to use depends on the level of measurement of the variable
under consideration, given that each level of measurement has a set of properties that
define the scope of permissible mathematical operations for quantitative data analysis.

The mathematical operations permitted for nominal level variables include categorical
classification of cases and the counting of the number of cases per category. Hence, the
mode is the only appropriate measure of central tendency for nominal level variables as the
mode identifies the most frequently occurring value or category of a variable.

As a level of measurement higher than nominal, the mathematical operations permitted for
ordinal level variables cover all mathematical operations applicable to nominal level plus
ranking of categories. As such, the median is the most appropriate measure of central
tendency for ordinal level variables as it identifies the central category in an array.

As the highest levels of measurement, interval and ratio level variables permit the use of all
mathematical operations applicable to nominal and ordinal levels plus addition, subtraction,
multiplication, and division. As such, mean is only appropriate for interval and ratio level
variables as the computation of mean requires addition and division.

• Measures of Variability

As with the measures of central tendency, the choice of which measure of variability to use
depends on the level of measurement of the variable under consideration. While there are
various statistics included in measures of variability, the most common measures cover
range and standard deviation.

Range identifies the distance between the highest value and the lowest value through
subtraction. Meanwhile, standard deviation identifies the average distance of values from
the mean. Its computation requires addition, multiplication, and division. Given the required
mathematical operations for the computation of range and standard deviation, these
measures of variability are only appropriate for interval and ratio level variables.

Table 3. Levels of measurement and measures of central tendency and variability

Nominal
- Mathematical operations: Categorical classification of cases; counting the number of cases per category
- Most appropriate measure of central tendency: Mode (Mo)
- Most appropriate measure of variability: --

Ordinal
- Mathematical operations: All mathematical operations applicable to the nominal level plus ranking of categories
- Most appropriate measure of central tendency: Median (Md)
- Most appropriate measure of variability: Semi-interquartile range

Interval
- Mathematical operations: All mathematical operations applicable to the ordinal level plus addition, subtraction, multiplication, and division
- Most appropriate measure of central tendency: Mean (M)
- Most appropriate measure of variability: Standard deviation (SD)

Ratio
- Mathematical operations: All mathematical operations applicable to the interval level
- Most appropriate measure of central tendency: Mean (M)
- Most appropriate measure of variability: Standard deviation (SD)



Popular tools for bivariate descriptive analysis

Bivariate descriptive analysis is designed to test and measure the association between a pair of
variables by describing the presence of relationship between the two variables, its strength, and,
if applicable, its direction. Depending on the strength of the bivariate relationship, results of
measures of association test may also allow us to predict the probability of occurrence of one
variable due to the presence of the other related variable.

The choice of which measure of association to use depends on the level of measurement of the
variables under consideration, given that each level of measurement has a set of properties that
define the scope of permissible mathematical operations for quantitative data analysis. As a rule,
the choice of measure of association to use should match the variable in the pair with the lower
level of measurement.

Table 4. Levels of measurement and corresponding measures of association


Level of Measurement Measure of Association
When there is a nominal level variable in the pair Phi (φ), Cramer’s V (V), Lambda (λ)
When the pair contains at least an ordinal level variable Gamma (G), Spearman’s Rho (rs)
When the pair only contains interval/ratio level variables Pearson’s R (r)

Results of measures of association test would yield a single value ranging from 0 to 1, if tests for
nominal variable are used, or from -1 to +1, if tests for ordinal, interval, and ratio level variables
are used. Statistical test result indicates:

• Presence of association

A non-zero result indicates that there is some degree of association between the pair of
variables. A zero result indicates absence of relationship.

• Strength of association

The closer the value of the result is to 0, the weaker the relationship between the pair of
variables; the closer the value of the result is to 1 (or +/- 1), the stronger the relationship
between the pair of variables.

• Direction of association

If the pair contains at least an ordinal level variable, the positive or negative sign prefixing
the value of the result indicates the direction of the relationship. A positive sign indicates
that the pair moves in the same direction (i.e., as one variable in the pair increases, the
other increases as well, or vice versa). On the other hand, a negative sign indicates that the
pair moves in opposite direction (i.e., as one variable in the pair increases, the other
decreases).

Moreover, the corresponding significance value (“p”-value or probability of error) of the
bivariate statistical test may also be examined. The section on inferential statistics will
present how significance value complements the results of bivariate statistics.



Popular tools for multivariate descriptive analysis

Multivariate descriptive analysis is designed to measure how a set of independent variables help
explain the occurrence of one dependent variable. Multiple correlation (R2) indicates how a set
of independent variables altogether explains the presence of one dependent variable. Typically,
several sets of independent variables, commonly referred to as models, are presented and the
model with the highest multiple correlation value (R2) is considered the model that best explains
the dependent variable.

Meanwhile, multiple regression, particularly the standardized coefficient (ß), indicates how much
each independent variable in a set contributes to the explanation of the occurrence of one
dependent variable. The standardized coefficient identifies which among the independent
variables included in each model offers the strongest explanation on the presence of the
dependent variable.

Apart from reviewing the results of multiple correlation and the standardized coefficient, their
corresponding significance value (“p”-value or probability of error) may also be examined. The
section on inferential statistics will present how significance value complements the results of
multiple correlation and multiple regression.

15.2.2. Inferential statistics

Inferential statistics is the branch of statistics that deals with making inferences about the
population based on findings from a representative sample. Since inferential statistics aims to
make inferences about the population using sample data, it is imperative that the sample should
be representative of the population so that it could generate the best possible estimates of the
population characteristics. The external validity of results from inferential statistics—in other
words, the generalizability of results—depends on the representativeness of the sample. A
representative sample is an adequately sized sample drawn through probability sampling
methods.

Basically, results of inferential statistical tests indicate whether descriptions of the representative
sample are statistically significant so that the said results could be used to describe the
population as well. On the contrary, if results are not statistically significant, then descriptions of
the representative sample cannot be generalized to the population.

In making inferences about the population, statistically significant results are interpreted based
on a tolerable amount of error. What does this mean? Even if results are statistically significant,
there is an acceptable degree of probability that the results would not match the true population
value. This tolerable amount of error (“α”) is the complement of the confidence level (α = 1 − confidence level). In the social
sciences, the confidence level is typically set at 95%, with a resulting tolerable amount of error of 5%
(“α = 0.05”). Hence, for results to be considered statistically significant at the 95% confidence level,
the probability of error (“p”) should be less than 5% (“p < 0.05”).

Inferential statistics encompass parametric and nonparametric analysis. Choosing between the
parametric and nonparametric tracks depends on the level of measurement of the dependent
variable being compared. When the dependent variables are interval or ratio level, parametric
statistics may be used. However, when the dependent variables are nominal or ordinal,
nonparametric statistics should be employed.



Parametric and nonparametric statistical tests are classified based on the number of sub-samples
being compared – one-sample case, two-sample case, and multiple-sample case:

a. In one-sample case, the descriptive statistical result computed from the whole sample is
tested for statistical significance. If results are statistically significant (i.e., at 95% confidence
level, p < 0.05), then it could be inferred that the descriptive statistical result derived from
the sample holds true for the population as well.

b. There are two classifications of samples in a two-sample case—independent and related
samples. Samples are considered independent when the entire sample is divided into two
mutually exclusive sub-samples such that members of each sub-sample are classified in one
and only one sub-sample (i.e., either female or male group). Meanwhile, samples are
considered related when the entire sample is observed twice, and each observation is taken
as one sample. Related samples are typically used in pretest-posttest research design. In
both cases, the descriptive statistical result computed from the two samples are compared
and the differences are assessed for statistical significance. If the difference between the
two samples is statistically significant (i.e., at 95% confidence level, p < 0.05), then it could
be inferred that the difference found in the two samples could also be found in the
population.

c. Similar to the two-sample case, there are two classifications of samples in a multiple sample
case – independent and related samples. In multiple independent samples, the entire
sample is subdivided into at least three mutually exclusive sub-samples. In multiple related
samples, the entire sample is observed at least thrice, and each observation is considered as
one sample. In both instances, the descriptive statistical results computed from the multiple
samples are compared and the differences are assessed for statistical significance. If the
difference in the multiple samples is statistically significant (i.e., at 95% confidence level, p <
0.05), then it could be inferred that the difference found in the multiple samples could also
be found in the population.

Essentially, if results from either parametric or nonparametric statistical tests are not statistically
significant, such as when the resulting probability of error is greater than 5% (p > 0.05), then the
descriptive statistical results computed from the sample could only be used to describe the
sample and could not be used to make inferences about the population.

Popular parametric inferential statistical tests

Parametric statistical tests are used when the level of measurement of the dependent variables
being compared is interval or ratio level. The choice of which parametric statistical test to use
depends on the number of samples being compared:

Table 5. Common parametric statistical tests


Number of samples compared Parametric Statistical Tests
One sample One sample t-test (t)
Two independent samples Independent samples t-test (t)
Two related samples Paired samples t-test (t)
Multiple independent samples One-way ANOVA (F)
Multiple related samples Repeated measures ANOVA (F)



Popular nonparametric inferential statistical tests

Nonparametric statistical tests are used when the level of measurement of the dependent
variables being compared is nominal or ordinal level. As with parametric statistics, the choice of
which nonparametric statistical test to use depends on the number of samples being compared:

Table 6. Common nonparametric statistical tests

Number of samples compared Nonparametric Statistical Tests
One sample Chi-square goodness of fit (X2)
Two independent samples Chi-square test for independence (X2), Mann-Whitney U (U)
Two related samples Chi-square test for independence (X2), Wilcoxon T (T)
Multiple independent samples Chi-square test for independence (X2), Kruskal-Wallis H (H)
Multiple related samples Friedman (Xr2)

15.2.3. The process of quantitative data analysis

Quantitative data analysis is a very linear process, from examining the accomplished
questionnaires for accuracy and validity, to developing a coding guide, to constructing a datafile,
to data encoding and data cleaning, to analyzing the data. Nowadays, quantitative data analysis
is facilitated by statistical analysis software where data can be encoded, processed, and analyzed.
In this section, we take you through the various steps involved in the quantitative analysis of data
and the steps in performing quantitative analysis in PSPP. The PSPP steps are articulated in boxes.

• Examining the accomplished questionnaires

Before anything else, you should examine each accomplished questionnaire to make sure
that all the items have been properly and clearly accomplished. Ideally, questionnaires
should be examined while data gathering is still being conducted, so that any vague or
incorrect response may still be clarified from the survey respondents or experiment
participants or addressed by content analysis coders. Otherwise, items that have been
improperly or vaguely accomplished will be considered as “Missing Response.”

• Developing a coding guide

A code guide lists down all the items in the questionnaire, their corresponding response
options, and the numeric code of each of the variable’s response options. Remember that
there are variables with inherently numeric values, and these numeric values are similarly
used as numeric codes. These variables are typically measured at the interval or ratio level.

However, for variables with categorical values, each categorical value of a variable is
transformed into a numeric code (i.e., the variable gender, with female and male as
conventional response options, is assigned with numeric codes “1” for female and “2” for
male). The assignment of numeric codes for variables with categorical values depends on
whether the variable is measured at the nominal or ordinal level. The assignment of numeric
codes for the response options of nominal level variables is arbitrary and the numbers
assigned are merely nominal; thus, these numbers do not carry numeric weight.



Meanwhile, the assignment of numeric codes for the response options of ordinal level
variables should follow some directional sequence so that the numbers also reflect an
increasing or decreasing order (i.e., the variable Latin honor standing, with cum laude,
magna cum laude, and summa cum laude as conventional response options, is assigned with
numeric code “1” for cum laude, “2” for magna cum laude, and “3” for summa cum laude, so
that the numeric codes also reflect the increasing order of the ordinal series).

Table 7. Sample Code Guide


Item Variable Numeric Codes
1 Name of the respondent Exact name
2 Age last birthday Exact numeric value
3 Educational attainment 0 No schooling
1 Some primary
2 Completed primary
3 Some high school
4 Complete high school
5 Vocational school
6 Some college
7 Completed college or higher
4 School/University last attended/currently attending Exact name of school/university
5 Marital status 1 Single
2 Married
3 Co-habiting
4 Separated
5 Widowed
6 Religion 0 None/Agnostic
1 Roman Catholic
2 Protestant
3 Christian
4 Iglesia ni Kristo
5 Islam
6 Others (specify):
7 Monthly household income 1 Less than PhP 10,000
2 PhP 10,000 – 29,999
3 PhP 30,000 – 49,999
4 PhP 50,000 – 69,999
5 More than PhP 70,000
8 What kind of group or 0 None
organization/s do you 1 Political party
currently belong in? (The 2 Trade union, business, or professional association
respondent may select more 3 Voluntary association
than one) 4 Religious organization
5 Sports/Leisure/Interest group
6 Others (specify)



• Constructing the datafile

Quantitative data analysis is made much easier with the help of statistical analysis software,
which can handle huge amounts of data.

But before you can actually use a statistical analysis program for your study, you have to
construct a data file. Datafile construction refers to the transformation of items in a
questionnaire into a file that could be subjected to data processing and analysis. It entails
naming and defining the variables that will be analyzed in the software. Below is a box which
contains the fields needed to be filled up in creating a variable in PSPP.

DATAFILE CONSTRUCTION IN PSPP


The following fields are defined in the VARIABLE VIEW in PSPP:
• VARIABLE NAME, which is, as the name states, the name of your variable. Note that in PSPP, variable names cannot
contain spaces and other special characters. Alpha-numeric characters may be used, as well as periods (.) and
underscores (_). It is advisable to have a naming convention to make variables names more consistent. This also
makes the variable names easier to type in the SYNTAX editor.
• TYPE refers to the kind of variable that will be analyzed. The most frequently used variable types are NUMERIC and
STRING. Examples of NUMERIC variables are age, height, and number of children. Numeric variables also include nominal variables
with categories that have an assigned numeric value. Assigning numeric values to nominal categories is done for
efficient encoding only, and not to show weights or numerical differences between categories. For example, the
nominal variable SEX could have two categories, FEMALE and MALE. If we assign 1 to FEMALE and 2 to MALE, then
that variable is defined as NUMERIC in PSPP. Meanwhile, STRING variables include text entries, such as a respondent’s name, and other
variables without an assigned numeric value.
• WIDTH refers to the number of characters allowed to be typed in the cell, specifically for STRING variables.
• DECIMAL refers to the number of decimal places that PSPP will display in the data view.
• LABEL is not a required field. However, it helps the coder identify the specific question or item that the variable
asks, which is why, typically, the questionnaire item is indicated in the LABEL field. PSPP also displays LABELS instead
of variable names when identifying variables to be included in analyses.
• VALUE LABELS is the field where numeric values are assigned. A new window appears once you click the (...) button.
Indicate the numerical assignment of the category in VALUE. Input the category name in the VALUE LABEL field and
click ADD. Finalize the categories by clicking OK.

Note that this kind of assignment will work for single-response items. Multiple response items are constructed
differently in PSPP; each category is defined as a separate variable and the value labels assigned are dichotomies (0 =
NO, 1 = YES), indicating whether the respondent chose such category or not. Hence, a multiple-response
questionnaire item with five categories will have five variables.
• MISSING VALUES specifies the numeric codes assigned that will be identified as MISSING by PSPP. Missing data can
skew numerical findings. To avoid including missing cases in performing data analysis, we assign a numerical code and
specify these in the MISSING VALUES field. 9, 99, or 999 are typically used as discrete values to encode missing data. A
missing value may also be identified as anything that falls within a range of values set in PSPP.



• COLUMN pertains to the column size of the variable displayed in DATA VIEW.
• ALIGN refers to the alignment of the encoded data in the variable.
• MEASURE refers to the variable’s level of measurement. As discussed earlier, these measures could be nominal,
ordinal, or interval/ratio. In PSPP, interval/ratio measures are referred to as a scale measure.
• ROLE specifies the use of the variable in the analysis. It could take on the following roles:
- INPUT (independent variable)
- OUTPUT (dependent or target variable)
- BOTH (both independent and dependent variable)
- NONE (no pre-identified or specific role)
- PARTITION and SPLIT (classifies the data into different samples)

Note that setting the role is only important for dialogues or add-ons that need these specific details. By default,
variables are set to INPUT, and common tests in PSPP can be run even if variable roles have not been specified.
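Variable definitions may also be written as syntax rather than set through the VARIABLE VIEW. Below is a minimal sketch, assuming a nominal variable named SEX coded as in the example above (1 = FEMALE, 2 = MALE) and 9 as the code for missing responses:

* Define a label, value labels, and a missing value for SEX.
VARIABLE LABELS SEX 'Sex of the respondent'.
VALUE LABELS SEX 1 'Female' 2 'Male'.
MISSING VALUES SEX (9).

Keeping these definitions in a syntax file documents the coding guide and makes the datafile easy to rebuild.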

• Encoding data and cleaning of encoded data

You are now ready to encode the responses from all the examined questionnaires to your
constructed data file. Using the numeric variables’ numeric values and the categorical
variables’ numeric codes, key in the appropriate number that corresponds to the
respondents’ response to each item in the questionnaire.

After encoding all the responses from all the accomplished questionnaires, check the
encoded data for stray codes or codes that fall beyond the expected range of answers (i.e.,
for the variable gender, which is conventionally categorized into either “1” for female or “2”
for male, any numeric values beyond “1” and “2” are considered as “stray”) and the correct
response should be encoded to replace the stray code.

• Processing Variables

Some variables are further processed before they are used for analysis. Processing variables
involves either of the following:
a. Recoding variables from their original numeric values or codes to make the range of
numeric or categorical values tighter (i.e., compressing the range of numeric values to
form segments or combining categories to form fewer categories), or
b. Computing an aggregate value, which is a single value that will represent a set of
indicators that measure a single variable (i.e., average comfortableness score). The
following box details the steps taken in recoding and computing variables.



RECODING VARIABLES IN PSPP

There are two common ways to transform data: a) recoding and b) computing.

A. Recoding

Recoding entails transforming categories of a variable by grouping them into new categories. For example, age, with
values ranging from 10 to 50, would give you 41 different categories, one per year of age. What if we're not
interested in their ACTUAL age? What if we just want to find out the distribution of the respondents according to
three age groups:
- Adolescents: 10-19
- Young Adults: 20-29
- Adults: 30-50

How do you do this in PSPP?

Step 1: Click TRANSFORM. Under transform, you will see a number of techniques for transforming data; notice the two
options for recoding: a) recode into same variables, and b) recode into different variables. The two options have the same
function; it's just that when you choose to recode into same variables, you will be overwriting your existing data and
replacing it with your new categories. Recode into different variables, on the other hand, will simply create a new variable
and will retain the original coding of your data. So, for now, let's select RECODE INTO DIFFERENT VARIABLES so that the
original ages are kept.

Step 2: Select the variables to be recoded. Upon clicking the option, a dialogue box will appear where you will be
specifying the variables to be recoded, as well as transformed values of the variable.

The variables in your data set are listed on the left side of the dialogue box. Select AGE (v5) and click the arrow button.

Click the selected variable and, inside the OUTPUT VARIABLE box, specify the new variable name and a label. Click
CHANGE once these details have been specified.



Step 3: Specify old and new values. After specifying the new variable name, click the OLD AND NEW VALUES button. This
will reveal a new dialogue box in which you will be specifying the new categories. There are seven options for selecting values.
In the case of age, we will be using RANGE, since we will be transforming a range of values into fewer categories.

Upon identifying the range, under NEW VALUE, select VALUE and type its new numerical value. Click ADD and do the rest
for the other range of values to be recoded.

Once old and new values have been identified, click CONTINUE.

Step 4: Check the data. Click OK to proceed with the recoding and check the data by running a frequencies test on the
recoded variable.
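The same recoding may be sketched in syntax, assuming age is stored in v5 and the new variable is named age_group (a name chosen here for illustration):

* Recode age (v5) into a new three-category variable and check the result.
RECODE v5 (10 THRU 19=1) (20 THRU 29=2) (30 THRU 50=3) INTO age_group.
VALUE LABELS age_group 1 'Adolescents' 2 'Young Adults' 3 'Adults'.
EXECUTE.
FREQUENCIES /VARIABLES=age_group.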



B. Computing

Computing entails creating new variables as a result of performing operations on other variables. For example, you can get
composite scores based on items in a scale by either adding their responses or getting the overall mean score of the scale.
In some instances, you may even use specific formulas to compute new variables.

Let’s try to get the respondents’ mean composite scores based on the following variables from the Kompetent Siya
dataset: v12, v14, v16, v19, v21, v26, v28, v46, and v47.

Step 1: Click TRANSFORM > COMPUTE

Step 2: Name the target variable. Upon clicking COMPUTE, a dialogue box will open. Since we're creating a new variable
by performing mathematical operations on the data, indicate the name of the new variable under TARGET VARIABLE.

Type in "MEAN_SOURCE" to indicate that this will be the mean score for competence as a source in communication.

You may also set the variable properties by indicating the label and type of the variable. Click the “type and label” button
below the target variable to reveal this dialogue box.



Step 3: Specify the operation. You may select the operations from the list of functions on the right side of the dialogue
box. Alternatively, we can specify the operation using the numerical buttons and operations available in the dialogue box.

Since we’re getting the mean score of the scale, look for the following function in the list:

MEAN(number[, number]...)

Double click the function and it should appear in the NUMERIC EXPRESSION box. Next, look for the variables mentioned
earlier and double click them to put them in the expression. Separate the variables with commas, and make sure that
they remain enclosed in the parentheses. You may also just manually type in the expression.

You should have something that looks like this:

MEAN(v12, v14, v16, v19, v21, v26, v28, v46, v47)

Step 4: Perform the expression. After specifying the operation, click OK to run the expression or click PASTE to view its
syntax in the SYNTAX editor. Upon running the expression, you will notice a new variable in your variable list. Run
descriptive statistics to check the results of your operation.
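In syntax, the same computation may be sketched as follows, using the variables listed earlier from the Kompetent Siya dataset:

* Compute the mean composite score for competence as a source and check it.
COMPUTE MEAN_SOURCE = MEAN(v12, v14, v16, v19, v21, v26, v28, v46, v47).
EXECUTE.
DESCRIPTIVES /VARIABLES=MEAN_SOURCE.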

• Analyzing Data

Quantitative data analysis software, like PSPP, is a powerful yet user-friendly tool that you
can use to analyze statistical data. PSPP offers a menu of statistical tests that allows you to
quickly run statistics by plugging in the variables required for analysis.

To recap, ask yourself these questions before analyzing quantitative data:

a. Are you working with data generated from a representative or non-representative
sample? Inferential statistics require data generated from a representative sample, while
descriptive statistics may be performed on data derived from either a representative or a
non-representative sample.

b. Are you going to test for association, for comparison, or both? Tests
for association cover bivariate and multivariate statistics, which are tests under the
descriptive branch of statistics. Check out the boxes below to examine how to perform
descriptive statistics and measures of association in PSPP.



DESCRIPTIVE STATISTICS
Univariate Descriptive Statistics

Step 1: Click ANALYZE > DESCRIPTIVE STATISTICS. Under ANALYZE in the menu bar, select DESCRIPTIVE STATISTICS.
This will reveal options for univariate analysis of data, namely, frequency, descriptive statistics, explore, and crosstabs.
Measures of central tendency (mean, median, and mode) and measures of variability (range and standard deviation)
may also be accessed under the Descriptive Statistics command.

Step 2: Click DESCRIPTIVE STATISTICS >
FREQUENCIES. In this section, we will focus
on the frequencies command to generate
descriptive statistics. This is done by selecting
frequencies under DESCRIPTIVE STATISTICS.
Note that both frequencies and descriptives
are typically used for univariate analysis of
data. The frequencies command, however,
will display the breakdown of responses per
category. Meanwhile, the descriptives
command only displays overall statistics and
does not include the median and mode measures.

Step 3: Select variables, measures, and other
output formats. Once you select frequencies,
you'll notice a dialogue box, as shown in the
figure here. The left box presents the
variables in the study. In this box, select the
variables that you want to examine and click
the arrow button. This brings the selected
variables into the box of variables to be
analyzed. Meanwhile, under the statistics
box, you can specify which measures you
want to employ for the selected variables. For
measures of central tendency, check mean,
median, or mode. For measures of variability,
check standard deviation and range.

There are other options available in the
frequencies command. The charts option will
allow you to present the data through bar
charts, histograms, and pie charts. The
frequency tables option will let you specify
whether to display frequency tables or not, as
well as the order of the presentation of
categories. Finally, clicking reset will revert
the frequencies dialogue box to its defaults.



Step 4: View the results. Once you have
selected the variables and measures, click OK
to display the results in the OUTPUT VIEWER.
Alternatively, you can click PASTE to reveal
the syntax of the command; in the
SYNTAX EDITOR, highlight the code
generated and, under RUN, click SELECTION.

In the frequency distribution table, the
frequency, percent, valid percent, and
cumulative percent are displayed. In reporting
findings, opt for the valid percent, as it
excludes invalid or missing data.

Meanwhile, the descriptive statistics table
displays the N or the sample size, the number
of valid and missing cases, followed by the
measures specified in the frequencies
dialogue box. Now, check out the descriptive
statistics of AGE; you should be able to get
the table shown on the right.

Step 5: Report the results. The table tells us that the mean age of the respondents is 41.07 years (x̅ = 41.07, 𝑁 =
1072). Now that you know how to run some basic statistical tests, let's try to examine the dataset that you have and
perform some basic operations.
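The frequencies run above may also be sketched in syntax. Assuming, as in the other examples in this section, that age is stored in v5:

* Frequency table and summary statistics for age (v5).
FREQUENCIES
/VARIABLES=v5
/STATISTICS=MEAN MEDIAN MODE STDDEV RANGE MINIMUM MAXIMUM.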



TESTING FOR ASSOCIATION THROUGH BIVARIATE AND MULTIVARIATE STATISTICS
Testing for association using the Crosstab function

All tests for association in PSPP can be
performed using the crosstabs command.

Step 1: Click ANALYZE > DESCRIPTIVE
STATISTICS. Under ANALYZE in the menu
bar, select DESCRIPTIVE STATISTICS, and
then click crosstabs.

Step 2: Select the variables. Clicking
crosstabs reveals the crosstabs dialogue box.
It is important that the dependent and
independent variables are properly
identified. Dependent variables are set as
ROWS in the cross tabulations, while
independent variables are set as COLUMNS.

Step 3: Choose the appropriate measure of
association. Once the variables to be cross
tabulated have been identified and sorted,
click STATISTICS to select the measures of
association you want to perform. It is
important, again, to know the levels of
measurement of the variables in choosing the
most appropriate statistical test to perform.
Note that you can perform multiple tests of
association in one statistical run.

Click CONTINUE once you're done choosing
the appropriate test.



Step 4: Select the data to be displayed in the
cells. After selecting the statistical test,
select CELLS to choose which data you want
to display in the cross tabulation.

Minimize the kind of data you display in the crosstabs to avoid confusion in reading and interpreting them. Be
guided by the following when choosing the data you want displayed in your crosstabs.

Count:
Categories Column1 Column2 Row Total
Row1 a b a+b
Row2 c d c+d
Column Total a+c b+d a+b+c+d

Row:
Categories Column1 Column2 Row Total
Row1 a b a+b
Row1 % a/(a + b) b/(a+b) (a + b)/(a + b) = 100%
Row2 c d c+d
Row2 % c/(c + d) d/(c + d) (c + d)/(c + d) = 100%
Column Total a+c b+d a+b+c+d
(a + b + c + d)/( a + b + c +
% of total (a + c)/(a + b + c + d) (b + d)/(a + b + c + d)
d) = 100%

Column:
Categories Column1 Column2 Row Total
Row1 a b a+b
Column1 % a/(a + c) b/(b + d) (a + b)/(a + b + c + d)
Row2 c d c+d
Column2 % c/(a + c) d/(b + d) (c + d)/(a + b + c + d)
Column Total a+c b+d a+b+c+d
(a + b + c + d)/( a + b + c +
% of total (a + c)/(a + c) = 100% (b + d)/(b + d) = 100%
d) = 100%
SOURCE: http://libguides.library.kent.edu/SPSS/Crosstabs



Step 4: View the results. Click CONTINUE
after selecting the data to be displayed in the
cells. Finally, click OK to view the results, or
click PASTE to view the syntax before running
it in the SYNTAX editor.

The OUTPUT viewer will display multiple
tables. The first table will display the
summary of the results, which includes the
valid cases, missing cases, and the total
number of cases run in the analysis. The
second table will be the cross tabulation of
your variables, as shown here using
competence in making small talk
(v14_recode) and sex (v4).

The specified measures of association will be displayed after the cross-tabulated data:

The Chi-square tests table, using two categorical variables, competence in making small talk (v14_recode) and sex
(v4):

In reading this table, we are interested in the Pearson Chi-Square values. These indicate whether there are statistically
significant associations between sex and competence in making small talk. The results show that there are no
significant associations between these variables, as indicated in the asymptotic significance (2-tailed) column.

The Symmetric Measures table, using the same variables:

Phi and Cramer's V, meanwhile, measure the strength of association between the variable pairs. For 2x2 crosstabs
(i.e., two rows and two columns), we look at the Phi value. For crosstabs with more than 2 rows and 2 columns, we refer
to Cramer's V.

Phi and Cramer's V scores range from 0 (no association) to 1 (perfect association). As indicated by the value here, the
association between sex and competence in making small talk is weak.

Step 5: Report the results. The results may be written in this manner:

There is no significant association between sex and competence in making small talk (V = .05). Sex is not associated
with one's competence in making small talk during parties.
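The crosstab above may be sketched in syntax using the same variables, competence in making small talk (v14_recode) and sex (v4):

* Crosstab with chi-square and Phi/Cramer's V, showing counts and column percentages.
CROSSTABS
/TABLES=v14_recode BY v4
/STATISTICS=CHISQ PHI
/CELLS=COUNT COLUMN.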



TESTING FOR ASSOCIATION THROUGH BIVARIATE AND MULTIVARIATE STATISTICS
Testing for association using Pearson’s R Correlation

As discussed previously, correlation tests are
performed between quantitative variables, or those
at the interval or ratio levels of measurement. They examine
linear relationships between two variables.

Step 1: Click ANALYZE > BIVARIATE CORRELATION.
Under ANALYZE in the menu bar, select BIVARIATE
CORRELATION. This reveals the bivariate correlations
dialogue box.

Step 2: Select the variables. Select at least two
variables that you want to examine for correlations
and click the arrow button. Below those boxes are two
options for the test of significance. If your research
hypothesis does not specify the directionality of the
relationship, select two-tailed. If your hypothesis
states either a positive/direct or a negative/inverse
correlation, then select one-tailed.

At the bottom of the dialogue box, put a check in the
box beside "Flag Significant Correlations." This
prompts PSPP to mark correlations that are
statistically significant, or those with p values less than
0.05.

Let's try to analyze the correlation between AGE (v5)
and the Mean Composite Score for Competence in
Speaking (tsk_spk).

Step 3: View the results. Upon selecting the variables, click OK. Display the syntax by clicking PASTE. A correlation
matrix will be displayed in the OUTPUT editor. Remember, correlations do not specify causal order of relationships.
Correlation is not causation; it can only specify the direction of relationship, whether direct/positive, or
inverse/negative.

Upon running the correlation test for age and mean composite scores for competence in speaking, you should have
the correlation matrix below.



Step 4: Report the results. The results may be written in this manner:

The results of the correlation test indicate that there is a weak, inverse* relationship between age and competence
in speaking, and this relationship is significant** (r = -.12, p < .001). The negative coefficient means that older
respondents tend to have lower mean composite scores; that is, as respondents age, they are more likely to
rate lower in competence in speaking.
*as indicated by the negative sign in the Pearson Correlation value
**shown in the two-tailed significance value
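In syntax, the same correlation may be sketched as:

* Pearson's r between age (v5) and mean competence in speaking (tsk_spk).
CORRELATIONS
/VARIABLES=v5 tsk_spk
/PRINT=TWOTAIL SIG.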

TESTING FOR ASSOCIATION THROUGH BIVARIATE AND MULTIVARIATE STATISTICS


Testing for association using the Regression function

When your analytical interest is to examine variables that predict a dependent variable, Regression should be
performed. PSPP can perform two kinds of regressions: linear regression and binary logistic regression.

Step 1: Recode categorical variables. Prior to performing regression, make sure that categorical variables are recoded
as dummy variables. This is done by recoding the variable such that its categories are valued at 1 and 0. For example,
you can recode RELIGION into the following dummy variables:
- RELIGION_CATHOLIC, in which 1 means BEING CATHOLIC and 0 means not being catholic
- RELIGION_ISLAM, in which 1 means BEING MUSLIM and 0 means not being Muslim
- Etc.

Perform the recoding such that all categories of variables to be included in the regression have been dummied. Try to
create dummy variables for sex (v4) and religion (v9).

Step 2: Click ANALYZE > REGRESSION. After recoding
dummy variables, you can start the regression
analysis. Under ANALYZE in the menu bar, select
REGRESSION, revealing the two regression tests
available in PSPP. For now, let's select LINEAR.

Step 3: Click REGRESSION > LINEAR. After selecting
Linear, the Regression dialogue box will appear.
Among the list of your variables, identify the
dependent variable and the independent variables,
including the dummy variables you just recoded.

Let's try to examine the mean competence score for
speaking (tsk_spk) as the dependent variable, and the
demographic variables age (v5), highest educational
attainment (v7), monthly household income (v10),
dummy variable MALE, and dummy variable
CATHOLIC as independent variables.

You may select the outputs you want to be displayed
upon performing the regression test. By default,
PSPP will display the R value, ANOVA table, and
coefficients. These are all the information that you
will typically be interested in.



Step 4: View the results. After loading the dependent and independent variables, click OK or click PASTE to display
the SYNTAX of the Regression command. Running the command will display three tables:

The Model Summary Table (R value): This
table displays the R and R2 values. The
value under R represents the correlation
value. Meanwhile, the R square value
represents the amount of variation in the
dependent variable explained by the
independent variables loaded in the
regression test. As indicated below, 12%
of the variation in mean composite scores
for competence in speaking is explained
by the variables we loaded earlier.

The ANOVA Table: The ANOVA table
displays the level of significance of the
regression model generated. Displayed
under the Sig. column in the Regression
row is the significance value, and the
results indicate that the regression
model is significant, because its value is
less than 0.05.

The Coefficients table: Finally, the coefficients table displays the changes in the dependent variable predicted by our
independent variables (under the Beta column in standardized coefficients), as well as the level of significance of
these variables (under Sig). The results show that age, highest educational attainment, and monthly family income
significantly predict mean composite scores for competence in speaking. The biggest predictor, based on the Beta
values, is highest educational attainment.

Step 5: Report the results. The results may be written in this manner:

A regression test was performed to examine the relationship between the respondents' sociodemographic
characteristics and their mean composite scores for competence in speaking. The test revealed that the respondents'
mean composite scores for competence in speaking were significantly predicted by their sociodemographic
characteristics, namely, age (β = -.06, p = 0.05), highest educational attainment (β = .32, p < .001), and monthly family
income (β = .08, p < .01). The regression model is significant (p < .001) and explains 12% of the variance in mean
composite scores for competence in speaking.
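A minimal sketch of the corresponding syntax, assuming the dummy variables created in Step 1 were named MALE and CATHOLIC:

* Linear regression of mean speaking competence (tsk_spk) on sociodemographic predictors.
REGRESSION
/VARIABLES=tsk_spk v5 v7 v10 MALE CATHOLIC
/DEPENDENT=tsk_spk
/METHOD=ENTER
/STATISTICS=R ANOVA COEFF.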



If your analytical interest is to examine significant differences between or among groups,
tests for comparison are to be performed. These tests cover the parametric and non-
parametric statistics, which are tests under the inferential branch of statistics.

COMPARISON OF GROUPS USING PSPP


A. One-sample t-test

Step 1: Click ANALYZE > COMPARE MEANS > One
Sample T-Test. Tests that analyze differences
between and among groups are found under
COMPARE MEANS in the ANALYZE option.

Step 2: Select the test variable and identify the test
value. The test variable must be a continuous
variable. The goal is to test whether the mean
score of the sample is close to the mean score of
the population of interest (test value). Transfer
this variable to the test variable box, and type in
the value to be tested and compared with the
mean score of the sample. You may also click
OPTIONS to change the confidence interval level.
By default, this is set at 95% CI.

Let's try to examine the mean score for
competence in communicating with family
(fam_gen), and let's set the test value to 4.

Step 3: View the results. Click OK or view the syntax by clicking PASTE. After running the command, two tables will be
displayed in the OUTPUT viewer. The first table displays the descriptive statistics of the test variable.

Meanwhile, the second table gives us the t statistic (t), degrees of freedom (df), level of significance (sig), mean difference
from the test value, and the confidence intervals.

Step 4: Report the results. Results may be written in this manner:

Based on the results, the sample had significantly higher scores than the test value of 4.0, t(1075) = 19.33, p < .001.
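In syntax, this one-sample t-test may be sketched as:

* One-sample t-test of family communication competence (fam_gen) against a test value of 4.
T-TEST /TESTVAL=4 /VARIABLES=fam_gen.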



COMPARISON OF GROUPS USING PSPP
B. Independent samples t-test

Step 1: ANALYZE > COMPARE MEANS >
INDEPENDENT SAMPLES T-TEST. Click
Independent Samples T-Test to perform
the command.

Step 2: Select the grouping and test
variable(s). The grouping variable must
separate the sample into two unrelated
groups. This means that the members of
each group are different and belong
exclusively in their respective groups. Let's
try to use the variable sex (v4) as the
grouping variable, and the mean
composite scores for competence in using
relational strategies (st_rel) as the test
variable.

Step 3: Define groups. After identifying
the variables, define the groups in your
grouping variable. Click the DEFINE
GROUPS button, which will reveal a new
dialogue box. Select "Use specified
values:" and select the two groups using
the drop-down menu.

Click CONTINUE after defining the groups.

As t-tests are inferential statistical tests,
you may also change the confidence level
(CI) using the OPTIONS button. By default,
the CI is set at 95%.

Step 4: View the results. Click OK to run the independent samples t-test. The OUTPUT viewer will display two tables:
The group statistics table will display the mean scores of the identified groups.



But we're more interested in looking at the t-test results. Assuming that your data are normally distributed and the variances
of the two groups are roughly equal, look at the first row (equal variances assumed) of the next figure.
The Sig (2-tailed) column will inform you if there are significant differences between the two groups.

Step 5: Report the results. The results may be written in this manner:

An independent samples t-test was performed and, based on the results, there are no significant differences between
males (M = 4.40, SD = .81) and females (M = 4.47, SD = .80) with regard to their composite scores for competence in using
relational strategies, t(1073) = 1.25, p = .30.
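The equivalent syntax may be sketched as follows, with sex (v4) coded 1 = Male and 2 = Female:

* Independent samples t-test of relational strategies competence (st_rel) by sex.
T-TEST /GROUPS=v4(1, 2) /VARIABLES=st_rel.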



COMPARISON OF GROUPS USING PSPP
C. Paired samples t-test

Step 1: ANALYZE > COMPARE MEANS > PAIRED
SAMPLES T-TEST

Step 2: Select the variable pairs to be tested.
The paired samples t-test compares the means of a
measure taken twice from the same set of respondents.
It is therefore typically performed to measure pre- and
post-test score differences. In the paired t-test dialogue
box, select the pre-test variable first, followed by the
post-test variable.

Click the OPTIONS button to modify the CI level.

Step 3: View the results. Click OK to run the paired samples t-test. You may also view the SYNTAX by clicking PASTE
prior to running the command in the SYNTAX editor. The OUTPUT viewer will display three tables:
The paired sample statistics table will display the mean pre- and post-test scores.

The second table displays the level of correlation between the variable pairs.

Finally, in the paired samples test table, look at the mean column, which tells us the average difference between the
variable pairs, and the level of significance, which tells us whether such difference is statistically significant.

Step 4: Report the results. The results may be written in this manner:

Based on the results of the paired samples t-test, the post-test scores increased by 0.04 after the introduction of the
intervention, and the difference from the pre-test scores is significant, t(1016) = 3.68, p < .001.
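The syntax follows this pattern, where pre_var and post_var stand in for your own pre-test and post-test variable names:

* Paired samples t-test comparing pre-test and post-test scores.
T-TEST /PAIRS=pre_var WITH post_var (PAIRED).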



COMPARISON OF GROUPS USING PSPP
D. One-way ANOVA

One-way analysis of variance (ANOVA) tests for differences among three or more groups.

Step 1: ANALYZE > COMPARE MEANS > One-way
ANOVA

Step 2: Select the dependent and factor variable.
The dependent variable must be at least an
interval-level variable, while the factor must be a
categorical variable with at least three
categories.

In the statistics box, check descriptives to reveal
the mean scores and standard deviations of the
categories. This also helps you describe the
sample. The homogeneity option, meanwhile, will
display the Levene Test of Homogeneity results for
the variances of the groups.

Step 3: View the results. Click OK after selecting the variables and ticking the descriptive statistics option. You may also
view and run its SYNTAX through PASTE.

The output window will display two tables. The descriptives table will reveal the descriptive statistics per category of the
specified independent variable.

Meanwhile, the ANOVA table will reveal if there are statistically significant differences between the groups. Our results
indicate that there are no significant differences when it comes to scores in competence in writing among the various
religions in the sample.

There are other options available for one-way ANOVA that can only be performed using syntax. You can view which groups
are statistically different by adding the following line to the syntax of your one-way ANOVA:
/POSTHOC=TUKEY



This will reveal the multiple comparisons table, as shown below:

Alternatively, you may select other posthoc tests to further analyze the results of the ANOVA.

/POSTHOC={BONFERRONI, GH, LSD, SCHEFFE, SIDAK, TUKEY, ALPHA ([value])}

Step 4: Report the results. The results may be written in this manner:

Based on the results of the ANOVA test, there are no significant differences among the religions when it comes to
competence in writing (F1056 = 1.17, p = .321)
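A minimal sketch of the full command, assuming the writing-competence score is stored in a variable named tsk_wri (a placeholder name) and religion in v9:

* One-way ANOVA of writing competence across religions, with Tukey post hoc comparisons.
ONEWAY tsk_wri BY v9
/STATISTICS=DESCRIPTIVES
/POSTHOC=TUKEY.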



TESTING FOR COMPARISON OF GROUPS THROUGH NONPARAMETRIC INFERENTIAL STATISTICS
A. Chi-square goodness of fit

Step 1: ANALYZE > NON-PARAMETRIC STATISTICS
> CHI-SQUARE

Step 2: Select the test variable. Select the
categorical variable to be tested. It could be a
dichotomous, nominal, or ordinal variable. Let's
try SEX (v4). As we're assuming equal proportions
of males and females in the sample, let's
keep the expected range and expected values as
is.

Step 3: View the results. The first table reveals
the observed distribution of the categories
(Observed N) as well as the expected distribution
(Expected N). The differences between them are
presented in the Residual column.

Meanwhile, the second table contains the level of
significance of this distribution. We can see that
the result is statistically significant.

Step 4: Report the results. The results of the Chi-square tests may be written in this manner:

A Chi-Square goodness-of-fit test was performed to examine the differences of distribution between the sexes, and the
results indicate that distribution of the sexes was not equal in the sample, X2 (1, N=1075) = 64.34, p < .001.
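In syntax, this goodness-of-fit test may be sketched as:

* Chi-square goodness-of-fit test on sex (v4), assuming equal expected proportions.
NPAR TESTS /CHISQUARE=v4.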



TESTING FOR COMPARISON OF GROUPS THROUGH NONPARAMETRIC INFERENTIAL STATISTICS
B. Mann-Whitney U

Step 1: Select the variables. Mann-Whitney U is the non-parametric equivalent of the independent samples t-test. Hence,
prior to running the test, identify the test variable as well as the grouping variable. Note the variable names and the
values assigned to the binary categories of the grouping variable.

For example:
- Test variable: mean composite scores for competence in explaining/reasoning (tsk_exp)
- Grouping variable: sex (v4); values: 1 = Male, 2 = Female

Step 2: Open the syntax window. Unlike other
commands in PSPP, the Mann-Whitney U can only
be performed using syntax. Click FILE > NEW >
SYNTAX to open the syntax window.

Step 3: Type the code. Type the following template
to run the test:

NPAR TESTS /MANN-WHITNEY = var_list BY var (group1, group2).

Where:
- var_list refers to the variable name of the test variable
- var refers to the grouping variable
- group1 and group2 refer to the values of the categories

Your code should be similar to the screenshot on
the next page:



Step 4: Run the code. Highlight the command and
click Run > Current line.

The results yield the following tables:

Our value of interest here is the Asymp. Sig. (2-tailed) value, which tells us if there are significant differences in the data.
Evidently, our results indicate that there are no significant differences between males and females when it comes to
competence in explaining/reasoning, given that the significance value is greater than the cut-off of .05.

Step 5. Report the results. The results of the test are parallel to how we write the independent samples T-test:

A Mann-Whitney U test was performed to examine differences in reasoning competence between males and females. The
results show that there is no significant difference between these two groups (U = 133846, p = .74).
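Filling in the template with the variables above gives:

* Mann-Whitney U test of explaining/reasoning competence (tsk_exp) by sex (1 = Male, 2 = Female).
NPAR TESTS /MANN-WHITNEY = tsk_exp BY v4 (1, 2).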



TESTING FOR COMPARISON OF GROUPS THROUGH NONPARAMETRIC INFERENTIAL STATISTICS
C. Wilcoxon Test

Step 1: ANALYZE > NON-PARAMETRIC STATISTICS
> 2 RELATED SAMPLES

Step 2: Select the variable pairs. Wilcoxon T is the
non-parametric equivalent of the paired samples
t-test, so it will require you to have a pretest-
posttest study design. Select these variables in
your list and check the Wilcoxon box under test
type.

Step 3: View the results. Click OK to run the command or PASTE to view the syntax and run the command from the syntax
editor. The OUTPUT viewer then displays two tables. The first table tells us the following information:
- negative ranks are respondents with a lower post-test score
- positive ranks are respondents with a higher post-test score
- ties are respondents with the same pre- and post-test scores

Meanwhile, looking at the Asymp. Sig. (2-tailed) row, the test statistics table tells us that there are statistically significant differences between the pre- and post-test scores for competence in communicating with friends. We report the Z value for the Wilcoxon T.

Step 4: Report the results. The results of the Wilcoxon test may be written in this manner:

A Wilcoxon test indicates that post-test scores for competence in communicating with friends were significantly higher than pre-test scores (Z = 2.48, p < .05).
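
If you prefer to run the test from the syntax editor, here is a minimal sketch, assuming PSPP’s NPAR TESTS command with the WILCOXON subcommand; pre_friends and post_friends are hypothetical stand-ins for your actual pre-test and post-test variables:

* Paired comparison of hypothetical pre- and post-test variables.
NPAR TESTS
  /WILCOXON = pre_friends WITH post_friends (PAIRED).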

D. Kruskal-Wallis H

Step 1: Click ANALYZE > NON-PARAMETRIC STATISTICS > INDEPENDENT SAMPLES

Step 2: Select the variables. Kruskal-Wallis H is the non-parametric equivalent of the one-way ANOVA. Select a categorical grouping variable, as well as the test variable/s, in the dialogue box that looks like the screenshot here:

Step 3: Define the groups. After identifying the variables, click the Define Groups button to select the range of categories to be included in the analysis. Click CONTINUE after setting the range.

Step 4: View the results. Click OK to run the command directly or PASTE to generate the syntax in the SYNTAX editor. Two
tables are generated in the OUTPUT viewer. The first table shows the mean ranks per category.

Meanwhile, the test statistics table presents the 𝜒2 value (Chi square), degrees of freedom (df), and the significance level
(Asymp. Sig). We see here that there are no significant differences among the groups, given the significance value.

Step 5: Report the results. Results of the Kruskal-Wallis test may be written in this manner:

There was no significant difference in the use of confrontational strategies among respondents with different marital statuses (H(4) = 2.72, p > .05), with a mean rank of 555.9 for single respondents, 507.43 for married (consensual) respondents, 530.85 for married (legal) respondents, 500.83 for separated respondents, and 529.49 for widowed respondents.
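
The same test can also be run from the syntax editor. Here is a minimal sketch, assuming PSPP’s NPAR TESTS command with the KRUSKAL-WALLIS subcommand; confront (test variable), marital (grouping variable), and the category range (1, 5) are hypothetical stand-ins for your own variables and group codes:

* Compare a hypothetical test variable across five marital status groups.
NPAR TESTS
  /KRUSKAL-WALLIS = confront BY marital (1, 5).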

E. Friedman

Step 1: Click ANALYZE > NON-PARAMETRIC STATISTICS > K RELATED SAMPLES

Step 2: Select the variables. The Friedman test is the non-parametric equivalent of the repeated measures ANOVA test. Hence, the variables must satisfy the conditions needed for this kind of test.

Select these variables in the dialogue box that looks like the screencap here:

After selecting the variables, check the Friedman box under test type.

Step 3: View the results. The first table displays the mean rank for each of the groups per condition.

Meanwhile, the second table, which is typically reported, displays the test statistic 𝜒2 value (Chi square), degrees of freedom (df), and the significance level (Asymp. Sig).

Step 4: Report the results. Results of the Friedman test may be written in this manner:

A Friedman test was conducted to examine the effect of (IV) on (DV) in three experimental conditions. The results
rendered a chi-square value of 819.09, significant at the p < .001 level.
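
For reference, a minimal syntax sketch of the Friedman test, assuming PSPP’s NPAR TESTS command with the FRIEDMAN subcommand; cond1, cond2, and cond3 are hypothetical variable names for the measurements taken under the three conditions:

* Compare three hypothetical repeated measurements.
NPAR TESTS
  /FRIEDMAN = cond1 cond2 cond3.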



15.3. Interpreting Findings

There are three levels of data interpretation: table or matrix reading, linking results to the study
framework and relevant literature, and drawing conclusions, implications, and recommendations.
This section will further elaborate on how you may interpret the results of your quantitative data
analysis.

• Level 1 interpretation: Reading from output tables and reporting statistical test results

Quantitative analysis is geared towards empirically validating what we theoretically know about an objective reality, typically stated as research hypotheses. Research hypotheses are propositions about some association between variables or some variable comparison between or across groups. These propositions are the subject of statistical testing. At the most basic level of interpretation, you have to read and report the results of the statistical output tables that you were able to generate after running statistical tests.

Your reading of statistical tables and reporting of results will depend on the kind of test that you performed, as shown in the step-by-step explanation of the tests in the previous sections. In general, your report primarily has to present the result of your test of association or test of comparison. The write-up for a bivariate test of association should report results on the
a. Presence of association (φ, V, λ, G, rs, r),
b. Strength of association,
c. Pattern of association if the set contains nominal level variable/s, or direction of association if the set contains at least ordinal level variables, and
d. The significance of results (p-value).

Meanwhile, the write-up for a multivariate test of association should report results of:
- Multiple correlation (R2) and its significance (p-value), and
- The standardized coefficient (ß) and corresponding significance (p-value) of each independent variable included in the model.

The write-up for a test of comparison using parametric statistics should report results on the 1) mean (M) and standard deviation (SD) score of each group included in the comparison, 2) results of the parametric test (t, F), and 3) the significance of results (p-value). Meanwhile, the write-up for a test of comparison using nonparametric statistics should report the results of the nonparametric test and the significance of results (p-value).

• Level 2 interpretation: Reporting whether statistical test results provide evidence that
support or fail to support the claims of research hypotheses

Research hypotheses are propositions about what you theoretically know about an objective reality and which you seek to validate through statistical testing. Your statement of research hypotheses, which should have been guided by your study framework (which in turn is informed by relevant theories, models, and existing literature), is subjected to statistical testing. Based on the results of your statistical testing, you may or may not be able to find significant evidence that supports the informed propositions that you have articulated.



Level 2 interpretation links statistical results with your study framework and relevant
literature. Thus, you have to clearly articulate whether statistical test results support or fail
to support the claims that you have proposed in your research hypotheses (which typically
state some association between variables or some variable comparison between or across
groups). This is consistent with the goal of quantitative research of empirically validating
what we know about an objective reality. When statistical evidence supports the claim of
your research hypotheses, then your findings serve to empirically validate what has already
been known about an objective reality. Otherwise, your findings open an area that may be
further theoretically, conceptually, and operationally explored and subjected to significance
testing. This is how social science inquiry contributes to the pool of knowledge, as findings
that support or negate the claims of a theory or a model, some aspects of a theory or a
model, an integration of theories or models, or even postulations from previous studies may
either validate what we already know or direct us to areas that can be the subject of further
social scientific exploration.

Note, however, that in some quantitative studies that do not aim to test hypotheses, Level 2 interpretation is done to link the quantitative findings to the literature and theory.

• Level 3 interpretation: Drawing conclusions, implications, and recommendations

In level 3 interpretation, you present your conclusion, which is essentially a synthesis of your
findings as you address the main research question that you posed in your study.
Additionally, you address the theoretical, methodological, and practical implications of your
findings and offer theoretical, methodological, and practical recommendations based on
these findings.

15.4. Managing quantitative data

Now, how can we manage data from content analysis and experiments? In doing content analysis, researchers describe and systematically analyze messages from a source and their characteristics. In experiments, researchers manipulate interventions/stimuli (or factors) and examine their effects on audiences. Now that you know how to do quantitative analysis, specifically using survey research data, you might be wondering how to do it when using other types of datasets. The next section discusses and demonstrates the common tests used in analyzing data collected through content analysis and experiments.

15.4.1. Content analysis data

More often than not, content is described nominally and ordinally. For example, in examining
print news articles, researchers look at placement of news article (whether it’s in the front page,
sports section, business, etc.), which is a nominal variable. Tone is usually categorized as either
positive, negative, or neutral, an ordinal variable. Another common variable is prominence,
which could be measured in terms of fold placement, e.g., upper fold or lower fold, an ordinal
variable. There are a few variables which operate at the interval/ratio level. For example,
prominence may be measured in terms of article size, e.g., dimensions in centimeters, number of
words.

Given the nature of content analysis data, a researcher cannot hypothesize that variables
examined have linear and direct relationships among them. Typically, content data can be
analyzed through descriptive statistics.



Another way of examining message characteristics is to check the intersections between two
categories, which can reveal message patterns.
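
In PSPP, such intersections can be examined through a cross-tabulation. The lines below are a minimal sketch, assuming PSPP’s CROSSTABS command; tone and placement are hypothetical stand-ins for the two categorical message variables you wish to cross:

* Cross-tabulate two hypothetical message variables and request a chi-square test.
CROSSTABS
  /TABLES = tone BY placement
  /STATISTICS = CHISQ.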

Researchers can test message characteristics a bit further by examining the significant differences between and among the categories of variables. For example, a researcher may hypothesize that there are significant differences in article size (DV) among article genres (IV). In testing message characteristics of two groups, the Independent Samples t-Test or Mann-Whitney U may be used. Meanwhile, One-way ANOVA and Kruskal-Wallis H can be used to examine differences in interval/ratio level message characteristics among three or more groups.
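
As a minimal syntax sketch of the article size by genre example, assuming PSPP’s ONEWAY and NPAR TESTS commands; article_size, genre, and the code range (1, 4) are hypothetical stand-ins for your own variables and genre codes:

* Parametric option: one-way ANOVA of a hypothetical ratio-level DV by genre.
ONEWAY article_size BY genre.

* Nonparametric option: Kruskal-Wallis H on the same hypothetical variables.
NPAR TESTS
  /KRUSKAL-WALLIS = article_size BY genre (1, 4).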

15.4.2. Experiment data

Unlike content analysis, research utilizing experiments always hypothesizes relationships among the variables and uses inferential statistics to test such hypotheses. Inferential statistical tests such as t-tests and ANOVA are commonly used, depending on the design of the experiment.
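
For a simple two-group experiment, for instance, an independent samples t-test could look like the minimal sketch below, assuming PSPP’s T-TEST command; condition (coded 1 and 2 for the two experimental groups) and recall_score are hypothetical variable names:

* Compare a hypothetical outcome between two experimental groups.
T-TEST GROUPS = condition(1, 2)
  /VARIABLES = recall_score.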

15.5. Writing quantitative research reports

Numbers tell a lot, but it is important to be able to unravel the story from the numerical data.
Effective reporting of quantitative research findings helps readers make sense of the statistical
results.

Quantitative research findings follow a strict research protocol, which leaves zero to minimal room for error. Deviations must be reported, and every detail must be accounted for and
declared. Thus, quantitative researchers must ensure transparency in reporting. Norris, Plonsky,
Ross, and Schoonen (2015) suggest the following details when writing quantitative research
studies:
- Describe the population of interest in detail, including their key characteristics
- Narrate the specific details of the sampling, which include the sampling method, recruitment
details, incentives (if any), response rates, attrition rates, group assignments, and bases for
the sampling size
- Describe the instrument and procedures for data gathering, scale construction, index
computation, coding, and scoring
- Provide evidence of reliability (through computation of reliability scores per concept, as well
as the overall internal consistency; for content analysis, provide the inter-coder reliability
scores) as well as the validity of the instrument (results of pre-testing and previous literature)
- Explain the research design used for the study and its appropriateness given the study’s objectives. Describe the study site and specify factors or conditions which may be unique in the area. Specify how variables were treated in the study, and how they were manipulated and controlled (for experiments)
- Present the complete statistical findings and key statistical information, such as level of
statistical probability, mean scores and standard deviations, as well as test-specific scores, as
indicated in the steps outlined in the previous sections

Being highly detailed in documenting helps your readers understand the various nuances of your
study and ensures that the study can be replicated by other researchers. Moreover, it allows for
accurate interpretation of the research results.

One of the qualities of quantitative research is the generalizability of its research findings, provided that proper probability sampling procedures are followed with a sufficient and representative sample. Thus, in drawing out generalizations from the quantitative findings, researchers must exercise caution and precision. This can be done by using qualifying language, as shown in the following example:



Based on the findings of the study, Filipino men are more likely to perceive themselves as competent
in making jokes, as compared to women.

Another way of qualifying research claims is to show their limitations, based on the sampled cases, context of the study, and supporting literature. Here is one example:

Filipino men living in Metro Manila seem to see themselves as good presenters. However, this may not
be the case for those living outside the region.

The use of cautious language ensures that your readers do not misinterpret your research.
By being transparent in your writing and cautious in your language, readers can derive more
meaningful interpretations from statistics and be better informed by quantitative research
data.



16. MIXED METHODS ANALYSIS

16.1. Overview

16.1.1. Review of methods

To get started with mixed methods research, let us first review our classification of methods and
the type of data we generate from each method as discussed earlier. Our classification system
arrays methods according to paradigms and topics.

By paradigm, we can categorize methods into positivist (content analysis, survey, and experiments) or interpretivist (textual analysis and ethnography). Depending upon its framework, reception analysis or case study can be positivist, interpretivist, or multi-paradigmatic.

By topic, we can group methods according to their unit of analysis. If our study examines sources and receivers, then we can implement it using surveys, experiments, reception analysis, or ethnography. If a study examines messages, then we can conduct content analysis or textual analysis. We can use case study if we are looking at different types of messages, sources, and receivers to inform our research.

When doing mixed methods, we can thus conduct


- A study within a paradigm but with different units of analysis (e.g., a study that subscribes
to the positivist paradigm that employs content analysis of newspaper articles and a survey
of their readers)
- A study across paradigms with the same units of analysis (e.g., a study that uses
quantitative content analysis and qualitative content analysis of the same newspaper
articles)

You can employ many other combinations depending upon the needs of your research.

16.1.2. Benefits and challenges of mixed analysis

Doing mixed methods research is naturally more difficult than doing a project with only one
method. So, why do it? There are reasons (Cathain, Murphy, & Nicholl, 2007; Small, 2011) and challenges (Johnson & Onwuegbuzie, 2004) in doing it.

• Complementarity and Comprehensiveness

Data which we collect or construct using only one method have inherent strengths and
weaknesses. Mixing methods enables us to address the weaknesses of any given method.

Consider the methods within the positivist and interpretivist paradigms. Mixing methods within the same paradigm generates data which cover not only messages but also their producers and receivers. The analysis of such data thus gives a holistic picture of a given phenomenon. Meanwhile, mixing methods across paradigms (whether within or across messages or sources/receivers) generates data which provide a multi-faceted understanding of our research topic.



• Confirmation

Through the analysis of mixed methods data, we can confirm relationships we cannot otherwise establish using data from only one method. The Agenda-Setting Theory, for instance, argues that the media define what people talk about. Confirming such a relationship requires both an analysis of media content and data from audiences themselves, which only a mix of methods can provide.

• Commensurability

Whereas complementarity, comprehensiveness, and confirmation are the benefits of analyzing data from a mix of methods, this fourth C points to a philosophical issue that underpins the analysis of data from methods which are informed by different paradigms. You may recall each paradigm represents a view about reality and comes with its own approaches to study that reality.

In doing multi-paradigmatic message analysis, for example, we must contend with the
different approaches of content analysis and textual analysis. Content analysis is theory-
driven and deductive in approach, whereas textual analysis is data-grounded and inductive.
The question which emerges, therefore, is one of commensurability. This refers to how we
marry otherwise competing arguments about reality and its study. If we fail to address this
issue of commensurability, then our analysis remains suspect as it has no clear or solid
philosophical foundation.

16.1.3. Considerations in doing mixed method analysis

In doing mixed methods research, we need to consider the following items which factor in the
conceptualization of the research problem, the collection of data, and the analysis and
interpretation of data. Let us list these considerations first before we locate them in the
analytical processes of mixed methods research in the next section.

• The framework

Mixed methods research typically entails the integration of theories. A multi-topic (messages plus sources/receivers) but single-paradigmatic research problem requires the combination of a message-centric theory and a human-oriented theory.

• The nature and number of data types

Mixing methods entails at least two sets and/or types of data. A single-topic but multi-paradigmatic research, meanwhile, will have two types and two sets of data. Our message analysis example will have, on the one hand, qualitative data from the textual analysis and, on the other hand, quantitative data from the content analysis.

• The method timeline

There are two general types of mixed methods research design—concurrent or sequential—
which are distinguished by the timing between them (Small, 2011).

As the name indicates, a concurrent design means methods are implemented at the same
time. Thus, data are collected, and subsequently analyzed, simultaneously.



In comparison, sequential design involves the implementation of one method before the
other. In this case, the data from the first method are analyzed first. This initial analysis then
informs the conceptualization and implementation of the second method. A sub-category of
sequential design is called nested design, in which participants from the first method are
asked again to join the second method in the research project.

We elaborate on the method timeline in the next section.

16.2. The Analytical Process

The foundation of mixed methods analysis begins with the statement of the problem and
objectives. It is in the problem where we articulate whether the project covers several topics
(sources, receivers, messages) and/or paradigms (positivist/interpretivist). We then develop our
study framework depending upon the requirements of our problem. Our methodology
subsequently aligns with the framework to ensure that we gather the data that answer our
problem.

16.2.1. Single-paradigmatic mixed methods

If your project involves only one paradigm, then the analysis is quite straightforward. Previous
sections detail the principles and procedures for analyzing data in positivist-quantitative and
interpretivist-qualitative studies, respectively.

The additional challenge in our case is that we are now dealing with two sets of data which come
from two methods which comprise our methodology. This methodology, in turn, aligns with our
framework and research problem and objectives.

• Concurrent

In a purely positivist study, for example, we are going to have one set of data from our survey and another from our content analysis. If we are to implement the two methods concurrently, then we need to perform the quantitative analysis as prescribed in an earlier section in this resource material. Do recall that in positivist research, we maintain an independent stance relative to our dataset; thus, the use of the third person perspective (“the researcher”) from the beginning to the end of the paper.

In a purely interpretivist study, for example, we are going to have one set of data from our textual analysis and another from our qualitative reception analysis. If we are to implement the two methods concurrently, then we need to perform the qualitative analysis as prescribed in an earlier section in this resource material. Interpretivist research requires our close interaction with the dataset. This includes memoing our reflexivities as we make sense of our data. As the analysis is strongly grounded on our insights and interpretation, we use the first-person perspective (“I”) throughout the paper.

In either positivist or interpretivist approach, we are going to need to decide how to present
our data which we collect or construct simultaneously. The straightforward answer lies in
our stated research problem and objectives. We organize our analysis according to the order
or logic of our objectives. Thus, if the first and second objectives are addressed by the first
method, then we first present the data from that method. We then present the data for the
other objectives accordingly.

Once we have analyzed and presented these data sets according to our objectives, then it is
time for interpretation, which we are covering in the next section.



• Sequential

Timing is the defining element of this approach. In a project with a sequential design,
analysis is correspondingly multi-staged. The analysis of the results in the first method
informs the design and implementation of the second method.

We may want, for instance, to perform content analysis first to determine the breadth of
mediated messages. We then use the findings in a subsequent survey instrument where we
ask respondents about their awareness, knowledge, attitude, and practices regarding these
mediated messages.

Similarly, we can first immerse ourselves in mediated messages through a textual analysis.
Once we have surfaced our own insights and reflexivities about these messages, then we can
meaningfully engage viewer-informants about their own understanding and meaning-
making about them.

16.2.2. Multi-paradigmatic mixed methods

This mixed methods design poses additional challenges because we are now contending with two paradigms, which raises the issue of commensurability. How can we integrate data which are collected and constructed according to the positivist and interpretivist paradigms, respectively, when there are philosophical differences in how each paradigm views and studies reality?

But let us not get into this intense debate now, as this is thoroughly covered in our earlier
sections. Let us focus instead on how such philosophical contentions translate into the practical
aspects of analysis. Perhaps the easiest way to explain this issue is in terms of which person-
perspective we use in our project. Do we use, for instance, the positivist “the researcher” or the
interpretivist “I”? In deciding which person-perspective to use we also indicate the overall anchor
paradigm of our research.

However, you may ask, what is our basis for this decision?

The simplest answer is which of the project components primarily answers our research problem
and which one provides complementary or confirmatory answers. Thus, in our analysis, we first
present the findings of the primary method, followed by the complementary or confirmatory
method. If we backtrack a bit in the research process, the order of our research objectives must
also signify this hierarchy.

16.3. Interpretation principles for mixed method studies

Once we have performed the analysis required by our mixed methods project (i.e., whether it is
single or multi-paradigmatic, concurrent, or sequential), our next step is to interpret our data.

Data interpretation has also been covered earlier in this primer. In this section, we only discuss
the principles of mixed methods data interpretation. In the next section, we demonstrate how
these principles have been applied in previous studies.



16.3.1. Reading across data

• Reading objectively and subjectively

Earlier we discussed the issue of commensurability because of the philosophical differences in positivist and interpretivist research. Specifically, positivism looks at the world objectively, while interpretivism looks at it subjectively.

In data interpretation, however, the divide between positivism and interpretivism blurs
significantly. It is because when we interpret data, and especially so in multi-paradigmatic
studies, we draw from our objective and subjective worldviews. To make sense of data, we
draw from our subjective and contextual experiences—on our own or with other people—as
well as otherwise objective or non-contextual information. It is we, as researchers, who
make sense of the data by connecting findings to each other.

• Reading for breadth and depth

Multi-paradigmatic mixed methods projects enable us to study a phenomenon extensively and intensively. Positivist data gives us the big picture of a phenomenon. And if our positivist research abides by probability and randomization principles, then we can generalize our findings from our sample onto our overall population. At the same time, interpretivist data allows us to focus on a detail of that picture which requires thorough investigation. Cathain, Murphy and Nicholl (2007, p. 87) write, “A quantitative method can help to generalize a qualitative study… (while) …. Qualitative methods can be used to consider the results of a study and their application within a real-world context, drawing on pluralistic views of different stakeholders.”

In sequential design where we do interpretivist research first, then we can surface themes
which we can then test in a positivist study. In this case, our deep investigation of the
occurrences in a phenomenon serves as the foundation upon which we can then proceed to
describing and explaining the extent to which these occurrences recur.

• Reading for confirmation and disconfirmation

Ideally, in single-paradigmatic studies, our data confirm each other. How does this work in
positivist research? In practice, it means, for example, that our content analysis data align
with our survey data. That what is shown prominently on television, for instance, is also
what people say they watch. How about in interpretivist research where we do textual
analysis and ethnography? It means the way we make sense of a television show jells with the way it is understood by our informant-viewers. For example, we and our informants may both surface the idea that a television show which supposedly presents feminist ideals instead stereotypes women into simplistic categories. In either of these cases, our multiple data sources confirm each other, thereby strengthening our singular argument.

One risk and challenge with mixed methods research, however, is that data from our multiple activities do not confirm each other—that our audiences do not report what the television show contains or that we and our informants have different perspectives about the show. The problem is less severe in interpretivist research than in positivist research because interpretivism provides for subjectivity. Insights and interpretations may not agree between informants and researchers so long as there is rigorous reflexivity and intersubjectivity among all participants. This means we may not agree with each other’s claims, but we understand and respect the process we went through in arriving at these arguments.

The problem is grave with positivist research. When findings do not align with each other
and especially when hypotheses are not confirmed, questions arise whether the literature
has been reviewed correctly, the framework has been operationalized appropriately, and the
design has been implemented correctly. Indeed, researchers must backtrack to the
conceptualization and implementation stage to offer reasons regarding the disagreement in
the data.

In multi-paradigmatic research, it may appear sometimes that data do not agree. For
example, we have a reception analysis project for a news television show. We then conduct
a probability survey and a series of focus interviews. Our findings indicate half of all survey
respondents perceive the show to treat news sensationally by reporting violent crime
emotionally. However, only nine (or a quarter) of our 36 focus interview informants believe
the reporting to be sensationalistic. You may say our findings do not confirm each other.
While it may seem to be the case, let us return to the paradigm and corresponding sampling
logic behind our methods.

Our survey’s findings are generalizable because the survey abides by randomization principles. It means, indeed, half of viewers, within the appropriate margin of error, share the perception that the show is sensationalistic in its presentation of the news. Our survey, being positivist in nature, is after recurrence.

In comparison, our informants have been selected purposefully, specifically through maximum variation sampling according to their age, gender, and income. The informants have been chosen to represent specific profiles, not to represent the general population itself. By identifying such profiles, we seek to surface all possible nuance in the understanding of the program’s presentation of news. This is in line with the goal of interpretivist research to explore all occurrences, regardless of how frequently or infrequently each occurrence recurs.

In the end, therefore, we must remember that our data come from different paradigms and
must be interpreted accordingly.

16.3.2. Linking to theory

The reminder regarding the inherent differences in paradigms is also important when we interpret data from single-paradigmatic or multi-paradigmatic projects. As you may recall from previous lessons, positivism is deductive while interpretivism is inductive in nature. We then interpret data accordingly if we subscribe to only one paradigm in our study.

What do we do then for mixed methods research? In this case, we use abduction, which is the
process of using both induction (surfacing of patterns) and deduction (testing of hypotheses) to
proffer the best interpretation of our data (Johnson & Onwuegbuzie, 2004, p. 17). The process of abduction is informed, meanwhile, by the concurrent or sequential design of our mixed
methods project. If the project is sequential in design, then the inductive or deductive approach
to analysis and interpretation informs each stage. In comparison, if the project is concurrent in
design, then we take our cue from our primary approach as we have discussed earlier in this
chapter.



16.3.3. Providing implications

You may also recall that the differences between paradigms pertain to the relationship between us and our surroundings. In positivism, an external objective reality determines our knowledge, attitude, and behavior, among others. This sense of determinism is grounded on the positivist orientation towards cause and effect. In interpretivism, meanwhile, our own subjective realities comprise a shared or constructed reality. This sense of voluntarism is founded on the interpretivist orientation towards personal agency.

It is important for us to remember this when thinking of and articulating implications from mixed
methods research. If our reception study on the television news show were only positivist in
nature, for example, we can propose a top-down behavioral campaign to mitigate the potential
impact of sensationalistic news delivery. This is in line with the deterministic underpinning of
positivist research. In comparison, if the study were solely interpretivist in nature, then we can
suggest bottom-up participatory activities where people can better make sense of their own
understanding of sensationalistic news. This, as you can see, is in line with the agentic argument
of interpretivist research. If our study were multi-paradigmatic, then we can make either or both
recommendations. How we prioritize each recommendation depends on the paradigm of our
primary approach or on other more practical considerations such as resource and logistical
concerns.



17. RESEARCH REPORTING FOR ACADEMIC AUDIENCES
Associate Professor Jonalou S.J. Labor, PhD

17.1. Overview

The research report is one of the highlights of your project. It tells your readers the reasoning,
the procedures, and the results of your research project. It states the recommendations and
implications of your findings. The research report must be engaging and of high quality because it
communicates the value and impact of your research to your audiences.

The report is “a systematic write up on the findings of the study” (Kabir, 2016, p. 501). It is also a
“formal account of how a research project was conducted and what is found out” (Thomas and
Hodges, 2010, p. 1). As such, it requires writers to be able to walk their readers through the
nature, process, and results of the research project.

The report is a record of a project for readers to use, assess, and incorporate in their own
research projects. It is used to help readers discover existing and new answers about a
communication phenomenon. It assists your fellow researchers in creating their own research
projects on a similar topic. Other researchers may also critique your report, particularly in terms
of its depth and breadth of analysis.

What are the criteria for a good report? A good report is:

a. Informational—A good report provides readers with the necessary material and detailed
information about your research.

b. Instructional—It contains the methodological and analytical procedures for doing a research
project on a specific topic. These procedures may be instructional for researchers doing
research in the same area.

c. Problem-solving—It clearly articulates and answers an important research question in communication and media.

d. Evidence-based— It contains the empirical basis for your results, summary, conclusion,
implications, and recommendations.

e. Persuasive—It must be able to sell an idea. It must argue that it has come up with a solution
to a research concern.

Researchers like you must also assume the role of a writer. As such, you must be able to take the
challenge of a) explaining the bases of your project in literature and theory, b) highlighting the
rigor of your research, c) ensuring the correctness of your findings, and d) convincing your
readers about the validity of your interpretations.



The value of a research report also depends on the navigability of its content. Here are some tips
in structuring and filling-in the contents of your report:

a. In the beginning of your report


- Begin by building the importance of your research investigation
- Pinpoint how your study adds to the understanding of the communication or media phenomenon you are examining
- Discuss how other researchers have studied the topic in the past
- State your problem and objectives as informed by previous research and the state of the
phenomenon you are studying

b. In the middle of the report


- Convince your readers that your selected theories inform your assertions
- Persuade them that your assumptions or hypotheses stem from your framework
- Tell your readers how you collected the data through a series of scientific steps and
procedures
- Provide a scientific and logical way of discussing the results.

c. In the end of the report


- Summarize and conclude the study
- Provide theoretical, methodological, and practical implications and recommendations

17.2. Understanding the types of research reports and their academic audiences

The report has a lot of potential readers or audiences who should be in your mind when you
begin thinking about writing your report. Here are some suggestions:
- The writing of the report must be able to influence the thinking and behavior of individuals
who share your research interests.
- The writer should be able to articulate the content of the report, convince the readers that
the analyses are correct, and that the recommendations would be beneficial to concerned
audiences.
- The research report is written for an audience that wants to be educated, inspired, and
helped in expanding their intellectual curiosity.

The entire research must be reported comprehensively, systematically, and completely. It must
be written according to the style, conventions, and expectations of the target audience. The
academic community, for instance, has standards when it comes to the content and form of a
research report. These standards help readers to comprehend the report easily and quickly.

Members of the academic community look for a manuscript that is readable and clear. They
want a cohesive report, the narrative of which follows a smooth progression. This means that
writers like you must provide a logically organized report that guides its readers in absorbing its
message.



17.2.1. Academic audience 1: Teachers, panel members, and students

Readers of communication and media research reports in the academic community include those
who study and teach communication, media, and courses in the allied fields. These readers
conduct basic and applied studies as a way to contribute to the growth of the discipline. Thus,
they read reports to
- Know the trends and research gaps in the discipline
- Understand the objectives, methods, and findings of previous research
- Assess the quality of the previous studies to make informed decisions about their own
research agenda and projects
- Incorporate the theoretical, methodological, and practical recommendations from previous
studies to their own research

The research teacher may also function as a research adviser. The adviser reads the thesis paper
to
- Assess the consistency and fit of the research problem to the framework
- See how the research problem has been addressed by the analysis of the collected data
- Examine how the recommendations and implications of the study link back to the
framework and the methodology

Panel members are also readers of the thesis paper. They comprise a community of experts
whose task is to examine the soundness of, and perhaps even strengthen, your research
arguments. For their part, panel members:
- Look into the conceptual clarity, significance, logical arguments, and soundness of the
proposed and final thesis paper
- Read the paper to gauge its merits and flaws
- Function as a collegial body to point out the shortcomings of the research proposal and its
implementation
- Ask questions and comment on the contribution of the research work relative to the body of
scholarly publications on the subject matter.
- Examine the clarity and substance of the report
- Provide constructive criticism to your report

The thesis paper

Most communication and media research teachers and students, especially at the undergraduate level, read one type of research report: the thesis paper. This is the penultimate written requirement asked of the student. It is the product of critical and creative thinking and is considered the synthesis of the undergraduate academic life of the student. It is also a theoretical argument of the researcher. Such an argument is based on a framework that is, as previously mentioned, based on data and evidence.

Writing the thesis paper or colloquially the “thesis” requires effective writing where one needs
style, organization, strategy, purpose, and a thorough consideration of who the audience is. Just
like any other form of academic writing, the researcher begins with who the intended audience is
and what the purposes of writing are.



A thesis manuscript is a detailed account of a student’s research project. It is supposed to be tailor-fitted to an audience that needs the relevant information from the research output. The academic background of the specific audience is a key consideration. This does not mean, however, that the thesis should use terms that only subject-matter experts understand. The material should be presented in the simplest terms to avoid misunderstanding. Consistency in argumentation, word choice, and tone is necessary. In short, the report is written with a diversity of audiences in mind.

17.2.2. Academic audience 2: Academic conference organizers, journal and book editors and reviewers

Research reports can come in the form of academic papers such as conference papers, journal
articles, and book chapters in research anthologies. These reports are reviewed for their
significance to the field, their connections to the literature, the strength of their theory, the rigor
of their methodology, the correctness and comprehensiveness of their findings relative to their
problem and objectives, and the value of their implications and recommendations.

For conferences, selection committees typically select which conference papers are included for
presentation. Academic conferences generally require that submissions
- Meet requirements in terms of form and style, as stated in the call for papers
- Contribute theoretically, methodologically, and practically to the discipline
- Be relevant to the nature of the conference
- Be impactful
- Appeal to the audience

For journal articles and book chapters, editors usually serve as the gatekeepers who first check whether manuscripts submitted to them can be sent for peer review. The reviewers or referees, meanwhile, are independent experts who gauge the worth and value of the manuscript. Referees are tasked to give authors a set of feedback to improve their work. The feedback of the referees helps editors assess whether the paper is fit for publication.

17.2.3. Academic audience 3: External audiences

Academic audiences may also include program managers of communication and media units,
organizations, and institutions. These managers are in the position to create programs and
policies that are based on the findings and recommendations of communication and media
studies. These individuals may be fellow knowledge workers who can act on the findings of your
research.

Frontline professional practitioners comprise another type of external academic audience. These are people who live in communities and neighborhoods that may have been sourced as study sites of the research. These could also be journalists or other media representatives who may have a special interest in the research topic and may be conceptualizing a TV, radio, film, or new media feature or documentary about the subject matter. External audiences may also be any member of the general public who may, out of curiosity, want to be informed of the details of the research. They read research reports in order to find empirical evidence for communication and media hypotheses and claims. They focus on the findings and their implications for the creation of policy, laws, and regulations.



17.3. Components of the research reports

Writing is a craft that can be learned and refined through constant practice. The proverbial adage:
“practice makes perfect” is true for research reporting. Good writing is attained because writers
write often and learn from experience. Being critiqued is also part of the training. Thus, good
writing is a product of self-discipline, attention to detail, constant writing habit, and constructive
criticism.

People become good writers because they do not work in isolation. By constantly exposing
themselves to various written materials and research reports, they begin to see the style and
forms of various writers and authors. Students like you should also learn that submitting a “first-
draft paper” would not help in polishing your arguments, construction, and style.

Knowing what to write for which section of the research report is a step towards being a
competent research report writer. Here are some tips on the different parts of research reports.

17.3.1. Abstract

This is a descriptive and comprehensive summary of the report. It allows the readers to know the research problem and objectives, study framework, methods and procedures, findings, and conclusion of the study. It usually contains between 150 and 200 words. Abstracts provide a preview of the full manuscript and help to attract the interest of potential readers.

17.3.2. Introduction

This sets the tone of the manuscript as it presents the context and key assertions of the study. It
also discusses the history of, as well as trends about, the research topic. This is usually one-tenth
of the length of the entire paper.

The Thesis and Dissertation Guidebook of the University of the Philippines’ College of Mass
Communication (Paragas, et al., 2008) identifies the following parts of the Introduction:

• Background of the Study

This section contains an introduction of the communication or media concern that the work
seeks to discuss using historical and baseline data and quantitative insights. It also contains
an introduction and explanation of the chosen cases such as the media organization, the
population, or even the specific geographical area that would be used as “site” for the
communication or media concern.

• Statement of the Problem and the Research Objectives

This section contains a 100-word paragraph discussion of the nature of the research problem, followed by a clearly articulated research question and a series of general and specific objectives.

• Significance of the Study

This section discusses the reasons why the study is being conducted. It provides the
theoretical, methodological, and practical purposes of the investigation. It provides an
overview of the implications and recommendations of the study.



17.3.3. Review of Related Literature

This chapter presents previous studies published in journals, conference compendia, books, research anthologies, and other academic publications.

The presentation is not an enumeration of past studies but rather a synoptic view of the
scholarly arguments of previous works that could serve as basis for the argument of the current
study. It identifies the gaps in the literature and explains how the current study addresses these
gaps. It covers each and every concept in the current research. The concepts are discussed
according to how they relate to the study objectives. It ends with a synthesis of the studies
reviewed.

17.3.4. Study Framework

This chapter presents the theoretical and conceptual foundations of the research. It discusses the
scholarly arguments of the theories and models that comprise the framework and inform the
research. It discusses the set of concepts that logically explains and/or predicts the relationships
of variables in a certain phenomenon.

In a thesis paper, on the one hand, positivist studies require three levels of frameworks.

At the theoretical level, the framework demonstrates how the theories guide the researcher in constructing a parsimonious explanation of the concepts in the study. It discusses the theories as argued by their original theorist/s. It also explains the strengths and weaknesses of the theories as they comprise the framework.

At the conceptual level, the framework applies the concepts into the study. It provides a
justification of how the researcher’s model aligns with the original intent of the theories.

At the operational level, specific measures of the variables as applied in the context of the
research are arrayed in the framework.

On the other hand, interpretivist studies require the theoretical and the conceptual levels of the
Study Framework.

Both types of studies require a section on the operational definition of terms.

17.3.5. Methodology

This chapter describes the gathering and analysis of research data. The sections are discussed in Part Two of this Primer.



17.3.6. Results and Discussion

This chapter presents the findings of the study. The arrangement of the results follows the order
of the specific objectives.

• Quantitative results writing

On the one hand, writing quantitative results requires a good quantitative analysis and the knowledge to clearly tell the story of numbers and statistical tests. Remember that rules govern the way numbers and statistical tests are reported. For instance, never start a sentence with a numeral. Note that numbers under 10 are usually written as words. Reporting outcomes of statistical tests depends on the citation style that is prescribed by your school or the institution where you want to publish your work.

• Qualitative results writing

On the other hand, writing qualitative findings depends on the study’s research design and methods. Creswell (2006) mentions different formats for reporting findings as the writing of the manuscript is based upon the methodological approach of the study. For instance, a phenomenological qualitative research report may look at the communication phenomenon from the “I” perspective while the ethnographic research report may use a third person or the “they” perspective.

What is important in the write-up is that themes are properly labeled and justified. Labeling includes the appropriate name and description of the categories. The researcher must also include significant statements from the raw data to help the readers understand the context of the theme. Including relevant quotations from the raw data also allows the readers to identify with your line of argument and reasoning. Remember that research participants must not be identified in the research report, so it is important to remove all details that may reveal their identity unless they consented to being identified in the research report.

17.3.7. Summary and Conclusion

This chapter has two parts: summary and conclusion. The summary addresses the general
objective by explaining the key findings of the specific objectives. Explanations that are guided by
the theory and are related to previous studies create a compelling summary. The conclusion,
meanwhile, answers the research question.

17.3.8. Implications and Recommendations

This chapter answers the “so what?” question that the researcher asked at the start of the research investigation. It provides details on the value of the research to theory. It compels researchers to provide a sound discussion of what the conceptual framework looks like after the data interpretation. It allows them to describe a new theory that has been developed from the study.



This component also provides a discussion of the methodological issues that have arisen from
the investigation. It explains the soundness of the methodology as well as the implications of the
data gathering process to the results of the research. It contains the recommended approaches
to future studies. It also discusses the applications of the findings to the improvement of certain
practices and policies. The discussion is prescriptive in tone, but still based on research findings.

17.3.9. Bibliography

There are a variety of citation or reference styles that may be used in writing the bibliography
(Swaen, 2019 via www.scribbr.com). The styles are a set of rules on how to refer to the sources
that were identified in the research report. All of them are used by writers to avoid plagiarizing
other people’s work. The most common citation styles in the field of communication and media
research are the American Psychological Association (APA) style, the Chicago Manual of Style,
and the Modern Language Association (MLA) style.

The APA Style is one of the most common styles in report writing. Originally used for the social sciences, the style has been adopted by various disciplines because a lot of journals and book publishers adhere to the style. In most universities and colleges that offer communication and media research, the APA is preferred. You may access the latest APA Citation Style either from the APA website or the OWL website of Purdue University.

The Chicago Style is used by writers in the humanities. Authors who want to publish in the areas
of literature, history, and the arts use this format. Unlike the APA, this style requires writers to
indicate the complete source in a footnote or an endnote and in a bibliography. You may access
the latest Chicago Style from its main website or the OWL website of Purdue University.

The Modern Language Association (MLA) style is used for publications in language studies. You
may access the latest MLA format from its main website or the OWL website of Purdue
University.

Citation styles differ in terms of how sources are written in the bibliography and how the sources
are cited in the text or within the manuscript. Remember that each citation style has a
recommended in-text (citation within a paragraph) format. Consistency is a must in using these
styles.

17.4. Key considerations in writing the research reports

17.4.1. Focus

A research report should have a clear purpose and parameters. To ensure this, researchers must always remember to address the research problem and objectives (RPO). While the data they gathered or constructed may be voluminous, always going back to the RPO means researchers do not digress from their paper’s intent.

Moreover, researchers may feel that so much is asked regarding the format and tone of their
report. Creating an outline and sticking to the paradigm of the research help in ensuring
consistency in the style and the overall coherence of the report. Learning from the narrative
exposition of the theses, dissertations, and journal articles that you have read also helps you in
writing your own report.



17.4.2. Organization

Good writing is organized. There should be a logical presentation of ideas that ends in a
reasonable conclusion. Researchers must prune their ideas to arrive at a refined paper.
Researchers should thus begin with an ending in mind. Ask, for instance, “What conclusion do
you wish to support when you finish your research?”

17.4.3. Tone

A research report is a well-written scientific paper that is simple, accurate, and precise. The
language of the research output must be free from unfamiliar vocabulary and jargon. The
output’s tone must also be applicable for its purpose and audience.

Tone refers to “the writer’s attitude toward the reader and the subject of the message” (Alamis, Villamarzo, & Ward, 2010: 93). Using either an objective tone for a positivist paper or a subjective tone for an interpretivist inquiry is an important step in avoiding the use of emotive language. Researchers must endeavor to use language that has paradigmatic foundations and empirical bases.

Here are some of the tips that writers of research reports must consider:
- Aim for concise and clear language
- Ensure objectivity
- Remain factual
- Assume an active voice
- Avoid uncommon terms



18. POPULARIZING RESEARCH
Assistant Professor Jon Benedik A. Bunquin, MA

18.1. Overview

Many studies are left to collect dust on library shelves. We do not want this to happen to our research reports, which are the product of our hard work.

Disseminating research helps the public make informed decisions. Individuals and communities can make better choices if they are fully aware of the issues and repercussions that attend specific practices in communication and media. Research provides people with the evidence they can use in making sound decisions.

This function of research goes hand in hand with the second reason why we disseminate
research: to transfer knowledge to stakeholders. Most of the time, research is conducted in
partnership with institutions. Given the highly technical nature of research, it is important to be able to communicate its findings effectively to stakeholders. These stakeholders may be aware of the issues discussed in the research but may not be well-versed in its technical aspects.
Popularizing research means communicating our findings in a manner that can be understood by
these groups without losing the integrity of research data.

Finally, we disseminate research because we want our outputs to be utilized. We want our
theses and dissertations to turn into policies, our manuscripts to guide individuals and
communities, or our studies to serve as input for various strategies.

Research Popularization Strategies

We can employ three strategies in popularizing our research findings. The most popular and widest-reaching strategy is popularization through the media. This entails transforming research
manuscripts into various formats that could serve as content for media organizations, such as
news articles, blogposts, social media posts, radio and television advertisements, and even
guesting and interviews in various shows. The goal is to provide visibility to the findings from
one’s research study.

Another strategy is harnessing the power of networks. This entails tapping various organizations
and individuals who can aid in the utilization of research findings. Getting in touch with advocacy
and interest groups, lobbying, and engaging in dialogue with key people can fast track the
transfer and utilization of knowledge generated through research.

Finally, formal education can also serve as another venue in disseminating research findings.
Although a more academic approach is typically utilized when disseminating through formal
education, formats such as entertainment-education materials and school roadshows require a
more popularized approach to dissemination.

The most important question to ask is: For whom is the research? Knowing the audience of your research is key before deciding on the optimal strategy. This is discussed in the next section.



18.2. Understanding the audience

18.2.1. Identifying audiences

When popularizing research, identifying and understanding your audience is never just an
option—it is a requirement. An audience-centered approach to popularization maximizes the
potential of your research.

But we do not just select anybody. We always select specific audience segments to whom we
communicate the findings of our research. Note that when we use the term audiences, we
assume that the public is heterogeneous and that each member consumes and understands
content differently.

So, our questions are as follows: How do we identify these audiences? What are our
considerations in selecting the receivers of our research?

In some instances, the specific audience segment is a given. If it is a commissioned research project (i.e., conducted for a client or an organization), then the audience is the client or organization that sponsored the research.

In other instances, the audience is yet to be identified, which is usually the case for research
produced in and by the academe. Scholars sometimes fumble in the dark trying to look for
people who will and should listen to what hundreds of their thesis pages have to say. We can use
the following criteria to identify audiences:

• Who participated in the research?

It is the ethical duty of researchers to return the findings of their studies to the communities or samples from which the data were gathered or constructed. Hence, as part of your
audience identification, include research respondents (for survey research), informants (for
interviews), participants (for FGDs or experiments), cases (for case studies), and subjects (for
ethnography).

• Who can benefit from the information provided by the research?

Who can be directly affected by the findings of the research? Based on the research findings, what kind of people would benefit from knowing the information laid out? In the "recommendation" section of the research, to whom does the research speak? It is not enough to communicate to those who are directly involved in your research; examine whom the samples or subjects represent.

• Who can act on the findings of the research?

In some instances, it’s necessary to talk to people who can provide the means or resources
to act on the findings of the research. Decision makers in organizations, key influencers,
representatives of organizations, and other people in positions of authority or power could
be identified as audiences as well. Policymakers, for example, can draft policies based on
your research findings. Media, meanwhile, can help in adding mileage to your research.



After pondering these questions, identify audiences by separating them into two types for the research dissemination plan:

• Primary Audience

Research dissemination is essentially a campaign, except that your message is based on the findings of the research. The primary audience comprises the most important stakeholders of those findings.

• Secondary Audience

These are the people who can help in realizing the findings of the research. They are leaders,
influencers, policymakers, and other bodies that also need to be informed because they can
transform research information into action.

18.2.2. Analyzing audiences

Understanding audiences does not only entail identifying the people with whom you will be
communicating, but also knowing their traits and characteristics. Good communicators analyze
their audiences. This entails understanding their characteristics, managing their expectations,
and providing them the kind of information they need. In general, we probe into two things:
audience demographics and audience psychographics.

• Demographics

Characteristics that are innate to the audience, such as age, sex, socioeconomic status, level
of education, race, location, and size, are called demographics. Understanding audience
demographics entails knowing their various characteristics, and the implications of these
characteristics. Furthermore, it helps researchers anticipate sensitivities that come with
certain demographic characteristics (such as race, sex, and culture).

• Psychographics

This refers to the cognitive (knowledge), emotive (attitude), and conative (behavior)
characteristics of the audience. These include the levels of knowledge they possess, their
expectations, fears, attitudes, aspirations, and egocentrism (concern for one’s welfare). This
helps researchers in crafting the message from their study, ensuring that it meets the
audiences’ expectations, provides them with information that they need, considers their
sentiments and opinions about issues, and engages them to take action.

18.2.3. Crafting the key message

If the audience could remember only one thing from your research, what do you want it to be?
Often, when researchers present their work to other scholars, they bombard their slides with
blocks of texts or complicated tables and statistical models. In some instances, this is acceptable,
especially if the researcher is trying to communicate to an academic audience.



However, if researchers are trying to get the public to understand their research, this type of
presentation might not register well with their audience. Not all audiences appreciate statistical
tables or highly theoretical concepts. Some of them may not possess advanced knowledge in
statistics, or deep understanding of theory. Others may get bored easily at the sight of numbers
or thick descriptions. Hence, it is important to focus on a central idea which encapsulates the
findings or insights you want to share about your research.

This central idea is called the key message. Key messages guide the content of research dissemination efforts. They ensure that every piece of evidence, illustration, story, quotation, case, and fact presented leads toward one compelling message.

The key message has three main characteristics:

a. Action-oriented—We always ask two questions in research: "so what?" and "what now?" The latter resonates well with this characteristic of the key message. A good message motivates or persuades people or groups into taking action.

b. Specific—Key messages are operationalized. They specify the problem and the action that needs to be taken to communicate research findings effectively.

c. Insightful—This means that a good key message demonstrates a clear understanding of the
issue and offers something new to its audience.

What we have done so far is design messages from one research study, based on the findings of that research. As mentioned earlier, key messages guide the subsequent content of a research dissemination campaign. Think of the key message as the topic sentence of a paragraph: every sentence must follow the thought of that topic sentence. In developing materials, whether as text in a slide presentation, bullets in a brochure, an AVP of a research finding, an infographic on a social media page, or a policy note for think tanks, be guided by the following message qualities:

a. Credible—This means that the source of information is perceived to be knowledgeable and compelling. Presenting evidence, such as statistical data and facts, also increases the credibility of a message.

b. Engaging—This refers to the attractive and stylistic qualities of a message that stimulate and sustain interest among audiences. This may also mean that the material is able to spur emotions from the audience and hold their attention through entertainment.

c. Relevant—This means that receivers regard the information being presented to them as
relevant to their current situation.

d. Understandable—This means that the content of the material/s being presented is designed in a way that matches the audiences' level of knowledge.

e. Possesses motivational incentives—This means that audiences are driven to action, which could be based on material or non-material rewards. This quality considers the audiences' question: "what's in it for us?"

Finally, in creating messages, it is important to laymanize your language. This means avoiding technical terminology, jargon, and other words that may not be understandable to an audience. It is writing in a way that is familiar to the audience – not too rigid or structured.



18.3. Developing materials for research popularization

The earlier sections of this chapter dealt mostly with the conceptual considerations of
disseminating research—whom to talk to and how to talk to them. This section presents the
operational aspect of research dissemination. Specifically, it presents three ways of developing materials for popularizing technical research outputs: for the visual aspect, it discusses techniques and principles in visualizing data and creating presentations; for the textual aspect, it covers writing research briefs.

18.4. Visualizing data

The need to visualize data in aid of popularizing research arose from two things. On the one hand, audiences have become bombarded with tons of information. The advent of the internet has made information readily available to everyone, immersing audiences in so much data that it has become immensely difficult to make sense of it all. As communicators, it is our role to make sure that audiences can maximize all the available information by helping them comprehend it through better storytelling. Visualizing data enables audiences to understand numerical data and see patterns of information better, and aids in audiences' sensemaking. Moreover, researchers can take advantage of data visualization to communicate their findings better to audiences.

On the other hand, the development of technology has also made software for data visualization available to everyone. As communicators, we can take advantage of these tools to help us reach our target audiences. MS Excel, Tableau, Vizable, Google Sheets, Chartbuilder, and Infogram, among others, are just some of the many software tools that we can use to visualize data. If they are available for use, then why not use them?

Now, there are three tasks at hand for researchers who intend to visualize their data:

a. Thinking visually—How do we transform numbers or words into something that can be easily identified and perceived?

b. Understanding context—Who are you speaking to? What chart elements would best
communicate to your audience?

c. Communicating ideas—What is the story behind the data? What is the key message that
you’re trying to communicate?

As most experts would say, data visualization is more than just creating charts; it is about telling stories. Hence, it is essential to understand the key message being communicated by your research prior to translating it into various popularized outputs, such as charts and graphs.



18.4.1. Creating charts

The three most commonly used types of charts are column/bar, line, and pie. This section
discusses these three and provides tips on how to use them.

• Column/Bar Graph

Bar graphs are considered the most flexible of all charts. However, that does not mean that we can use them for everything. Here are some guidelines you can follow in creating bar graphs (a short charting sketch that applies them appears after the figures below):
- Use bar graphs to illustrate categorical data
- Arrange the values from lowest to highest so that your readers can easily spot the significant values (the lowest and highest scores).
- Use only one color to denote a specific data series; use contrasting colors to accentuate
or emphasize specific categories
- Remove unnecessary grid lines
- Set the minimum Y-Axis value to ZERO

Column or bar graphs (Figure 1) are commonly used to compare amounts or magnitudes between categories. The differences in height or length make it easy for readers to compare categories. Column graphs can either be clustered (Figure 2), to show series of data within categories, or stacked, to show subgroups within categories.

Figure 1. Simple Bar Graph          Figure 2. Clustered Bar Graph
[Charts not reproduced in this text version; Figure 1 plots one data series across three categories, while Figure 2 plots three data series across the same categories.]
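To make these guidelines concrete, here is a minimal sketch in Python using the matplotlib library. The platform names and counts are hypothetical and only illustrate the tips above: one color for a single data series, values sorted from lowest to highest, no grid lines, and a Y-axis that starts at zero.

import matplotlib.pyplot as plt

# Hypothetical counts of respondents per media platform (for illustration only)
categories = ["Radio", "Print", "Television", "Social media"]
values = [6, 9, 12, 18]

# Sort the values from lowest to highest so readers can spot the extremes easily
values, categories = zip(*sorted(zip(values, categories)))

fig, ax = plt.subplots()
ax.bar(categories, values, color="steelblue")  # one color for a single data series
ax.set_ylim(bottom=0)                          # the Y-axis always starts at zero
ax.grid(False)                                 # remove unnecessary grid lines
ax.set_ylabel("Number of respondents")
ax.set_title("Respondents per platform (hypothetical data)")
plt.show()

The same guidelines apply when the chart is built in a spreadsheet tool such as MS Excel or Google Sheets; the library used here is only one of many options.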

• Line Graph

Line graphs utilize changes in slope of line segments to compare differences in magnitude
across various points. Typically, they are used to show changes in value/magnitude over
time (Figure 3). Multiple line graphs display changes of multiple categories over time (Figure
4).

Line graphs are useful for showing changes in categories over time. Here are some guidelines you can follow in creating line graphs (a short sketch applying them appears after the figures below):
- Use line graphs to illustrate time-based changes
- Avoid presenting line graphs with more than five categories/data series to avoid a
“tangled” line graph
- Add markers/data points to help readers track the changes, but do not make the
markers too big/obtrusive
- Remove unnecessary grid lines
- Set the minimum Y-Axis value to ZERO



Figure 3. Simple Line Graph          Figure 4. Multiple Line Graph
[Charts not reproduced in this text version; both plot values against the years 2007 to 2014, with Figure 4 tracking Categories 1 to 3.]
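The sketch below, again in Python with matplotlib and made-up values, follows these guidelines: it keeps to three data series, adds small markers, removes grid lines, and starts the Y-axis at zero.

import matplotlib.pyplot as plt

years = list(range(2007, 2015))
# Hypothetical values for three categories (keep the number of series low)
series = {
    "Category 1": [5, 6, 8, 9, 11, 12, 14, 15],
    "Category 2": [3, 4, 4, 6, 7, 9, 10, 12],
    "Category 3": [2, 2, 3, 5, 5, 6, 8, 9],
}

fig, ax = plt.subplots()
for label, values in series.items():
    # small markers help readers track the changes without cluttering the lines
    ax.plot(years, values, marker="o", markersize=4, label=label)

ax.set_ylim(bottom=0)   # the Y-axis starts at zero
ax.grid(False)          # remove unnecessary grid lines
ax.set_xlabel("Year")
ax.set_ylabel("Value")
ax.legend()
plt.show()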

• Pie Graph

Pie graphs are used to show composition. This means that the data must total to 100%, and
that there are no overlapping categories. But here’s the thing: most pie graphs are
incorrectly designed, and don’t really help readers.

Consider the example in Figure 5 below. We frequently see something like this in reports. However, this type of pie chart should be avoided. It is difficult for readers to identify which slice refers to which category. It is also difficult to compare the differences among slices. It might be better to opt for a bar graph instead, so readers can better understand the data. Now, compare it to the pie graph in Figure 6:

Figure 5. Pie Chart with Many Categories          Figure 6. Pie Chart with Few Categories
[Charts not reproduced in this text version; Figure 5 crowds ten similar slices (A to J) into one pie, while Figure 6 uses only five clearly differentiated slices (A to E).]

As compared to the first pie graph, Figure 6 is easier to understand. You can easily distinguish
which category is the highest, and which one is the lowest. Moreover, differences in sizes are
more evident in this pie chart.

Another function of a pie graph is to show the proportion of a segment in relation to a whole by showing a pie graph consisting of two categories (Figure 7). A series of pie graphs, meanwhile, can show progress across timepoints (Figure 8).



Figure 7. Pie Chart Comparing the Proportion of One Segment to a Whole          Figure 8. Multiple Pie Charts Showing Completion Progress
[Charts not reproduced in this text version; Figure 7 splits the pie into segments A (32%) and B (68%), while Figure 8 shows completion rising from 32% on Monday to 55% on Wednesday and 84% on Friday.]

When creating pie graphs, be guided by the following tips (a short sketch applying them follows this list):

- Use pie graphs to display the composition of variables with five or fewer categories
- Add data labels to help readers discern the composition of the pie chart. This eliminates the need for a legend
- Arrange the categories from highest to lowest in a clockwise manner
- Use contrasting colors to differentiate categories more easily
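Here is a minimal Python/matplotlib sketch of these tips, with hypothetical shares that total 100%: five categories, slices arranged from highest to lowest and drawn clockwise, and percentage labels on the slices instead of a legend.

import matplotlib.pyplot as plt

# Hypothetical composition data; the shares total 100% with no overlapping categories
labels = ["A", "B", "C", "D", "E"]
shares = [40, 25, 20, 10, 5]   # already arranged from highest to lowest

fig, ax = plt.subplots()
ax.pie(
    shares,
    labels=labels,
    autopct="%1.0f%%",    # data labels on each slice remove the need for a legend
    startangle=90,        # start at the top of the circle
    counterclock=False,   # draw the slices in a clockwise direction
)
ax.set_title("Composition of a hypothetical variable")
plt.show()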

18.4.2. Refining your visualization

Make stronger visualizations by considering three things: the structure of the visualization, the
clarity of the charts, and the simplicity of the layout.

• Structure

- Consistent structure—Ensure that when you create multiple data visualizations, these
contain the same elements all throughout. Most visualized data must contain the
following elements: a) title, b) subtitle, c) chart, and d) source line. Check out Figure 9
below from Pew Research Center.

Figure 9. Example chart from the Pew Research Center [image not reproduced]



- Consistent placement and weighting—Try to maintain the proportions of the elements
across multiple data visualizations. If data labels are placed outside the line/shape, then
consistently do so throughout the other charts.

- Limit eye travel—Do not spread out elements too much. This avoids noise in the data,
and helps readers focus on what you’re trying to communicate. Legends are helpful, but
it’s always better to connect values to their visual counterparts. Try to avoid pointers.

• Clarity

Clarity is probably the ultimate goal of data visualization specialists. It reaches the bliss point – the AHA! moment. It indicates that what is being communicated by the chart has been understood by the audience.
- Remove nonessential information to ensure nothing is extraneous
- Make sure that each element is unique and serves to support the visual
- Ensure elements are not ambiguous and send a clear message to the viewer
- Take advantage of conventions and metaphors (blue for cold, red for hot, etc.)

• Simplicity

Only present what is needed. Remove elements that are not necessary in the material, including extra colors, lines, shapes, and text. Data visualization experts refer to such clutter as chart junk, and the most notorious form of chart junk is 3D design. In data visualization, we always prioritize the clarity of the design over fancy special effects and illustrations. Choose a simple look that can deliver the key message.

18.5. Creating presentations

Aside from writing and creating graphs, researchers usually do presentations about their
research findings. In this section, we discuss the principles and techniques in preparing and
conducting oral presentations.

18.5.1. Designing slide presentations

The structure and organization of slide presentations should be aligned with the storytelling. This stresses the idea that slides should support a speaker during the research presentation, not distract audiences from what the speaker says. The speaker is the star, not the slides.

The structure of the presentation may vary. It could follow a linear structure based on the
research paper (background, RRL, theory, method, findings, and recommendations). This is ideal
for communicating to colleagues.

It could also take on the reverse structure—beginning with the findings of the research, and the
steps that led to those findings. This structure is effective for communicating to more
popularized audiences.

A general to specific route is also employed in presentations, which means beginning from the
global, broader issues, before zooming in on specific factors or components. This strategy can
present nuances in the findings and communicate to specific actors regarding their roles.
Another technique in structuring presentations is starting with the simple findings first before
building into the more complex topics.



In designing slide presentations, Sue and Griffin (2016) list the following tips in creating better
slides:

• Eliminate slide junk

Slide junk, like chart junk, refers to the non-essential content of slide presentations. In creating effective and striking visuals, take out elements that do not really matter or do not contribute anything to the storytelling, such as headers, footers, titles, logos, and page numbers.

Check out the two slides below. Figure 10 contains slide junk, while Figure 11 has been
revised to present only the essential elements.

Figure 10          Figure 11 [slide images not reproduced]

• Think visually and maximize visual elements

Slides are used as visual aids. Hence, they should contain more visual, rather than textual,
elements. Pictures, icons, logos, and graphs should dominate the presentation, not walls and
blocks of text.

However, researchers typically put in blocks of text, which audiences simply read for themselves. Presenters are also tempted to just read from the slides instead of explaining the material more spontaneously. This decreases the credibility and authenticity of the speaker and reduces audience engagement.

For slides that contain more conceptual rather than data-driven discussions, use icons or images, and transfer all the text into the notes. Check out Figures 12 and 13 to see how this looks. Meanwhile, maximize the graphs generated from the research and add them to the presentation.



Figure 12          Figure 13 [slide images not reproduced]

When using images, make sure to maintain their proportions and maximize their visual impact. As shown in Figure 14, presentation designers sometimes fail to do this by simply placing an image in a text-filled slide. Figure 15, on the other hand, maximizes the visual impact of the image while retaining its textual elements.

Figure 14. Poor use of image Figure 15. Maximized use of image

• Consider the readability of the slides

Fonts, colors, alignment, and spacing must be considered in designing slides.

For fonts, the recommended minimum size is 30 pts to make sure that text is still readable for audiences at the back of the venue. Limit font choice to sans serif fonts (such as Arial and Helvetica) for body text, as they are more neutral and readable. Meanwhile, serif fonts (such as Times New Roman or Garamond) may be used for titles and headers. You may use more than one font; designers recommend using two to three fonts at most, provided that their use is consistent throughout the material. For transferability of files, embed the fonts in the file. TrueType fonts are usually embeddable.

For color, develop a scheme that will be used consistently throughout the presentation. To
illustrate, headers could be all navy blue, sub-headers could be sky blue, and body texts
could be dark grey. Contrast is another consideration in color. Make sure that there is
enough contrast between the text and its background for readability. Never use a dark text color on a cool or dark-colored background.

For alignment and spacing, make sure that there’s enough breathing space between textual
elements. Align the elements to the grid so they don’t look cluttered.



• Animations should be used to provide emphasis

When in doubt, do not use animations. However, when used effectively, animations can
provide a dramatic effect on a presentation. Use fade or wipe animation styles, as they
provide the most subtle yet effective way to emphasize elements. Fade and Wipe transitions
also maintain the smooth flow of a presentation. Quirky, dizzying animations like twirl and
spin should be avoided, as they just annoy audiences.

Use animations to show how things work. Animations could be used to direct eye movement
and show the process from one stage/phase to another. You can also use them to emphasize
certain items. Slipping in an animation in one slide surprises audiences, which can provide
impact for important details.

18.5.2. Delivering slide presentations

The star of the presentation should not be the well-designed slides or the flashy visuals. You, the presenter, should be the focal point of a presentation, which means that your oral delivery should be the priority. A good presenter can deliver the message well even without a visual aid.

When preparing for presentations, know how much time is allotted, write down your script,
rehearse your material, prepare the necessary equipment, and arrive early at the venue and do a
quick tech-run. Meanwhile, when delivering presentations, think of yourself as a storyteller. And
good stories always have a strong opening, an engaging body, and an impactful ending.

18.6. Designing poster presentations

Poster presentations are another way of presenting your research findings. Usually done in
conferences, poster presentations can display information and engage interested audiences. Two
things are considered in designing poster presentations: content and layout.

18.6.1. Deciding on the content

The nature of poster presentations is highly academic—but unlike research manuscripts, poster presentations are thought of as "short stories." The content is usually similar to that of an abstract: stated in broad and direct terms. It contains only the key points of the research (Diffie, n.d.), and audiences who want to know more about the study usually engage the researcher, who typically stands near the poster.

Text is kept to around 250 to 500 words to ensure that the poster will not simply be a research manuscript printed on a large tarpaulin. The poster contains more visual than textual elements, and the goal is to deliver information as effectively as possible.

Poster presentations usually follow the structure of the research. It begins with the
background/context of the study, followed by the methods, results and discussion, and the
conclusion of the research. Some posters may also begin by highlighting the findings/conclusion
of the study, followed by the specific details that led to that conclusion. This is to catch the
attention of readers who are simply interested in the findings of a study rather than in reading the whole research.

Similar to slide presentations, keep jargon and technical information to a minimum. If a technical term must be present in the content, then make sure that it is explained well, so that general audiences and non-specialists are able to make sense of the information presented.



18.6.2. Laying out the poster elements

When organizing elements in the poster, make sure that the flow of the research remains logical.
Visual cues may be used to aid readers in making sense of your posters. Using columns, for
example, can direct the readers’ eye movements. Figure 16 shows two types of poster layouts: a
horizontal and a vertical layout. Let’s examine the elements of these posters further.

Figure 16. Sample Poster

• Contrast

In designing posters, select a color scheme which can best communicate the findings of your study and stick to that color scheme. This ensures that your layout has a consistent and organized aesthetic. Watch out for contrast in your colors: text set on a dark background should be light, and vice versa. Avoid using images as poster backgrounds, as their varying colors can make text unreadable due to poor contrast. As shown in Figure 17, poor contrast leads to poor readability.

Figure 17. Good Contrast vs. Poor Contrast [image not reproduced; the word "Research" is legible in the good-contrast panels and barely readable in the poor-contrast panels]



• Text Styles

Develop a style guide when using text to ensure consistency in use. Specify the font size,
style, and weight of the titles, headers, subheads, body text, and chart/graph labels.
- Titles should be readable from a distance.
- Headers take on the second largest font size in the poster
- Body texts should be at least 24 pts to ensure readability

Font choice is also important when it comes to styling text. Note that font choice is not arbitrary – each typeface conveys a message. In general, sans serifs are the safest choice when it comes to layout. Popular ones include Arial, Helvetica, and Verdana.

18.6.3. Writing research briefs

Research briefs (sometimes referred to as policy briefs) are documents prepared for decision-making. They are summaries of long research papers and are written for non-technical audiences. They are typically prepared for "informed, non-specialists" (International Center for Policy Advocacy, 2017, p. 10), which refers to people who are aware of the issues at hand but do not conduct any technical research on the subject matter. These could be NGO advocates, policymakers, politicians, and journalists. They are not written for academic audiences; hence, the writing style is popular, non-technical, and non-academic, albeit professional and formal. Aside from this, the ICPA lists other characteristics of research briefs:
- Engaging—This means the document contains insightful discussions and thought-provoking
facts.
- Relevant and practical—This means that the document is framed in a manner that talks about the issues of the audience and the questions they are asking. It also means that the document offers recommendations that are actionable and realistic.
- Succinct—This means the document is short and readable, as most of its intended audience do not have the time to read thick theses and dissertations.
- Limited—This means it considers only the aspects of a larger issue that are relevant to its audience. This entails taking out only a component of a larger study for the purpose of disseminating it through a policy brief.
- Understandable and accessible—This means the document is free from highly specialized
language or jargon and is well-explained utilizing an easy-to-follow structure. The document
is also laid out for easy reading. Elements such as graphs, tables, headers, and subtitles must
be considered.

How do you write research briefs? Here are some tips:


- Write the title in a snappy, yet informative manner—Avoid wordy titles that contain too
many technical terms.
- Provide an executive summary—Before the introduction, include a one-paragraph write-up that states the problem, core findings, and recommendations based on the research, much like the abstract of a thesis.
- Explain the summary of the problem, and why it is important—The goal is to make the
issue more urgent and salient to the readers and stress the importance of taking action.
- Explain the methods of data gathering and analysis in brief—Synthesize existing literature
and data and provide a short explanation regarding the method of data collection.
Remember to use non-specialist language in writing this section.
- Write the results—State the specific research questions as sub-headers and then write the
findings as answers to those questions. The results may also be written following the
framework of the study.



18.7. Disseminating in non-traditional formats

Academic conferences, peer-reviewed publications, and policy notes are considered the traditional venues for disseminating research results. To maximize the potential of evidence
found in research, findings must be disseminated to audiences outside of the research
community. Through the internet, researchers can now reach a larger audience and disseminate
the findings of their research.

18.7.1. Engaging audiences through social media

Social media has opened venues of connection for people, including researchers. In fact, a study has revealed that papers mentioned on Twitter are more likely to be cited by others (Eysenbach, 2011), demonstrating how social media is linked to greater research visibility.
Researchers who are active online also get a chance to interact with their readers, answer and
clarify questions, and provide additional insight beyond their manuscript.

Engaging entities beyond the academe is essential in disseminating research, and social media is able to provide that venue. The DRIFT (Disseminating research information through Facebook and Twitter) framework by Ryan and Sfar-Gandoura (2018) has shown that research information shared through these social media platforms was able to generate high levels of local and international engagement. Creating Facebook pages,
engaging in Twitter conversations, and popularizing research-related hashtags are just some of
the ways through which researchers are able to harness the power of social media. Other
websites such as LinkedIn, Academia, and ResearchGate can also open venues for social media
engagement between researchers and different audiences.

18.7.2. Making research available through digital repositories

Another way of making your research accessible is to deposit your studies in digital repositories.
These are platforms which aggregate various forms of research, such as theses and dissertations,
journal articles, policy notes, and unpublished studies. Papers in open-access digital repositories, or platforms that can be accessed free of charge (without a paywall), have been observed to substantially increase the impact of a research paper (Gargouri et al., 2010). This is because open-access repositories enable users to discover studies easily. Disciplinary repositories, or those which aggregate studies on a specific academic discipline, serve as a one-stop shop for related studies, helping students and scholars gain a better understanding and grasp of the "state-of-the-field," especially among repositories with extensive and comprehensive collections.

18.7.3. Self-publishing through blogs and podcasts

Another way of engaging audiences is through blogs and podcasts. Blogs provide a venue for researchers to share reflections, opinions, and other ideas. Researchers also get to share their work and engage others through blog discussion forums. There are a number of blogging platforms available online, such as WordPress and Medium, which are free and highly customizable.

Podcasts, meanwhile, are another way to make research accessible to audiences. These are audio recordings in which researchers describe and discuss the findings of their research. In some cases, they feature a radio-interview-style format for a more engaging discussion of research. A number of institutions and researchers have begun utilizing this format to deliver research findings. It is also a convenient medium for listeners: audiences can absorb research findings easily and hassle-free.

Epilogue
DOING RESEARCH IN THE POST-PANDEMIC ENVIRONMENT¹
by Professor Violeda A. Umali, PhD and Assistant Professor Ma. Aurora Lolita L. Lomibao, MA

The Department first engaged in the project of working on this Handbook in 2018, to contribute to the existing
resources on media and communication research. However, the pandemic and the subsequent lockdown in
2020 had major impacts on the research and academic environment, leading to changes in the ways we
conceive of, and practice, research. This drove the Department to ponder on what a post-pandemic research
environment looks like, and how we can configure our student mentoring to better adapt to the “new
normal.”

The new normal does not only present challenges but also offers opportunities for researchers. We should
thus take equal note of the (a) constraints and threats and (b) new possibilities (for topics, frameworks, and
design) that the new normal poses for research practice.

The practical and ethical considerations for doing research in the new normal can be depicted as a house with four pillars and four walls.

The four pillars are Safety, Compassion, Rigor, and Ethics. These four pillars are not new concepts; they are
staple considerations when we do research. But in the new normal, these four pillars now carry nuances that
were not so pronounced before the pandemic happened.

The walls refer to the major phases in the research process: Conceptualization; Data Gathering; Data Analysis and Interpretation; and Report Writing.

Doing Communication Research in the New Normal

¹ This was presented in Session Four of the Comm Res Conversation Series that was livestreamed on the UP CMC Communication Research Facebook page on 01 September 2020.



The Four Pillars

Pillar 1: Safety

This pillar covers the physical, psychological, economic, and social conditions of both the researchers and the
informants/respondents. It is now upon us to manage the safety of those involved in the study. For example,
there is a need to balance (a) the need for interaction and immersion with people and communities against the need for social distancing, and (b) the need for fieldwork against the reality of reduced physical mobility. Use a risk-benefit analysis to determine whether studies can be safely continued, continued with modifications, or temporarily halted. Anticipate disruptions to the study.

When appropriate, consider revising protocol to allow data collection and interaction without in-person
contact, using phone or videoconference apps such as Zoom. For field research, for example, students can first
find out the most recent local and national guidelines from the DOH, IATF, and LGUs before going out. This
way, they would learn about safety guidelines such as the use of masks and face shields, the location of high-
risk areas, the curfews in effect, and the available modes of transport, among others.

Moreover, it is imperative for researchers to wear the necessary safety equipment – face masks and face shields at a minimum, and even gloves and PPEs – and to make sure they have access to safe transportation. Since both researchers and the people they study risk exposure to COVID-19 in face-to-face interactions, researchers should ideally call participants beforehand to update them on current COVID-19 information and screen them before interviews.

Here are some guidelines to consider:


- Discuss the risks with participants, so they can decide whether to still participate or not.
- Provide private rooms for interviews.
- Sanitize the high-traffic and high-touch areas well.
- Minimize direct contact with the interviewee.
- Do not share pens for signing forms.
- Have protective gear such as masks and hand sanitizers ready for the participants.
- Reconsider whether to have face-to-face interactions with immunocompromised people, pregnant
women, and older adults.
- Provide debriefing and psychological support to participants.
- Discourage face-to-face FGDs.

Pillar 2: Compassion

Compassion means increased sensitivity on our part to the new economic and social realities of both the researcher and the informants/respondents. It means noticing the emotions of others and being motivated to provide relief and to ease feelings of uncertainty. If there was ever a time to practice compassion, it is now.

Researchers, whether faculty members or students, need a lot of support due to the troubling circumstances
they face on a daily basis. Not everyone is ready to share stories or express their feelings about what they are
going through. But, on occasion, they will feel needy, vulnerable, pressured, scared, uneasy, uncertain about
their future. They may feel grief, loss, sadness, exhaustion. They may be grappling with major changes in their
lives, like pressing family responsibilities or financial needs. They encounter an endless, panic-inducing blizzard
of information and misinformation about the state of the world and the country and may feel confused and
hopeless.



But compassion should be extended not just to the researchers, but to the informants/respondents as well.
Communities that we used to consider as “safe” and unproblematic to conduct research in may now be more
suspicious of having strangers in their midst. What may once have been so-called “neutral” topics or themes
may trigger negative emotions in some people that we interview.

Pillar 3: Rigor

Rigor, in the context of scientific research, refers to "the quality of being thorough and accurate" (Cypress, 2017). Rigor runs through the entire research process, from conceptualization to implementation, dissemination, and evaluation. Therefore, a rigorous study has to tick all these boxes:

□ Research question is properly formulated.
□ Theoretical anchors, and research design and methods are appropriate for research objectives.
□ Study's design and implementation are aligned with theoretical framework.
□ Research methods are implemented "scrupulously and meticulously" (Pickard, 2013).
□ Research results are interpreted properly.
□ Full disclosure is observed in research report.
□ Manner of reporting is adapted to the target audience.
□ Report is well-written.
□ Evaluation of research is undertaken with objectivity and thoroughness.

The requirement of Rigor may seem daunting. Some might be thinking: "We are in a pandemic situation; should we be strict about Rigor?" There are both long and short answers to that question.

The short answer is “Yes, we should still be strict with Rigor even under the current circumstances.” Here are
three explanations for this answer:

First, there is a simple but very forceful argument about maintaining academic excellence even in pandemic
times. UP Diliman Chancellor Fidel Nemenzo said, for example, that notwithstanding the strains of living in the
new normal, the University will continue to uphold the twin values of compassion and academic excellence. He
further pointed out that we should not create a false dichotomy between these two values and instead regard
them as “mutually reinforcing values that will help us survive this crisis and face the challenges of the post-
COVID-19 world with honor, integrity and solidarity” (Nemenzo, 2020, para. 14).

Stated in another way, on the one hand, Compassion requires that when we make research decisions, we
should give due consideration to the well-being of both the researchers and the informants/respondents. At
the same time, however, our concern for feasibility should not compromise the standards of scholarly
research. Academic excellence - which in this case equates with Rigor - demands that whatever choices we
make, they should not be at the expense of the standards of quality research.

Second, when we look at the requirements for Rigor enumerated above, it becomes clear that we cannot waive any one of them even under the new normal. For example, it is not acceptable to say that, because there is a pandemic, the research question need not be properly formulated, the theoretical framework need not be well-developed, or the report need not be well-written. To accomplish Rigor, effort must be exerted; otherwise, it cannot be called Rigor.

Third, Rigor is achievable. Although Rigor itself is non-negotiable, there are differences in achieving Rigor,
depending on one’s research design and methods. This allows for some “room for maneuver” as far as
complying with standards of Rigor is concerned. Research questions lend themselves to a variety of methods
for finding the answers. But there will always be a preferred way to answer a research question. From the
nature of the research question alone, one would know that the best way to answer it is, for example, through
an interpretivist approach rather than a positivist approach, or through an experiment rather than a survey, or



through a combination of quantitative and qualitative approaches rather than using only a single approach.
However, sometimes, the preferred way is difficult to implement because of certain constraints. For example,
if there are not enough resources to conduct an experiment or it is difficult to access the texts to analyze, then
alternative approaches can be adopted. In making these decisions, rigor requires that you take stock of the
impact of your choices on the outcome of your research.

What is the bottom line? Rigor is not rigor mortis. It is not inflexible to the point that there is only one, non-negotiable way to implement a study. It is not a mortal sin if one is unable to implement a study in the most ideal way because of valid and significant resource constraints, security reasons, or other legitimate factors that limit the conduct of research. One must, however, always disclose the scope and limitations of the study. Findings must also be reported with these limitations in mind.

Pillar 4. Ethics

There is a section on Ethics in this resource material, so it suffices here to remember the two guiding principles in the responsible conduct of research: Beneficence, which is the obligation to do good, and Non-maleficence, which is the obligation to do no harm.

There are additional general ethical lookouts under the new normal:
1. Copyright, intellectual honesty, and data privacy. Mentors should ensure that the integrity of the texts, and of any other materials used in research, is respected.
2. Privacy and confidentiality. In the same vein, informant/respondent rights over their private
information should be a major concern.
3. Disclosure and informed consent. Informant/respondent trust should be maintained by assuring that
they have full understanding of the nature of the research, and any possible consequences of their
participation in it.
4. Codified research protocols. Students, mentors, and administrators should discuss and agree on the
implications of the new normal on the institution’s research activities, and then codify and
disseminate them properly.

The Four Walls

Wall 1: Research Conceptualization

What is the desired output of conceptualization? A study that (1) is feasible, or can be implemented given the resources (time, money, and technology, as well as the competencies of the researcher) at the students' disposal, and (2) meets the standards of scholarly research. To get to this goal, we have to do a lot of going back and forth between the theoretical and practical considerations involved in implementing the study.

In the new normal, what are the additional things to take into account—aside from the ones that are already
stated in the textbooks and this resource material?



Here are some practical and ethical guideposts:
1. Explore possible new topics and approaches for research in the new normal. There are many online
resources that you could consult regarding new research ideas in the post-pandemic world.
2. Assess choice of research focus vs. safety and access to resources.
3. Assess research focus vs. potential harm that it can bring to the research participants.
4. Research focus must satisfy standards of scholarly significance set for research outputs at a particular
level of the students’ academic program. For instance, there are different expectations for a class
term paper and a thesis, or for an undergraduate and a graduate thesis.
5. Recalibrate research plans by exploring the following options:
• To not recalibrate the research focus and forgo implementation of the study—The research
design will be fully fleshed out, with outputs to include pre-tested/finalized instruments. The
study will not be implemented; instead, the research report will list recommendations for the full
implementation of the study.
• To not recalibrate the research focus and undertake a pilot implementation of the study—The
research design will be fully fleshed out and the study will be implemented on a smaller scale
than what the original design calls for. This could be an option for studies that require fieldwork,
e.g., ethnography, experiments, surveys, and case studies.
• To recalibrate research focus by choosing another research method which may be “less robust”
but is more feasible to implement—For example, do a quasi-experiment instead of a true
experiment, do a survey instead of an experiment, or do a case study instead of ethnography.
• To recalibrate research focus by scaling down some aspects of the research design – For example,
reduce the number of concepts to be covered by the study, reduce the number and types of
indicators/measures of concepts/variables, choose a sampling scheme that is easier to
implement, etc. The limitations arising from the scaling down should be duly acknowledged. At
the same time, the scholarly value of the study - despite the scaling down to be implemented -
should be affirmed.
• To opt out of the research project when “the most ethical response is to weigh the value of
research itself against the dangers, rather than merely seeking ways to continue while minimizing
danger” (Carayannis & Bolin, 2020, para. 4).

Walls 2 and 3: Data Gathering/Generation, Analysis, and Interpretation

Researchers during the pandemic are presented with challenges to mobility and access to participants and resources. At the same time, however, they also have opportunities to access more online materials and to crowdsource resources.

Here are some practical and ethical alerts for the different research approaches:

For Quantitative Research


1. Require transparency on respondent decisions.
2. Expand consent forms, to include pandemic risks and precautions.
3. Include safety concerns, such as social distancing and protective equipment, for offline experiments,
and data security for online experiments.
4. Ensure sampling representativeness as much as possible.
5. Mandate truthfulness in reporting results.

For Qualitative Research


1. Acquire safety and health guidelines for field research from national and local authorities.
2. Guarantee protective equipment and safe transportation.
3. Reconsider inclusion of COVID high-risk populations as participants.
4. Reimagine “community immersion.”
5. Assure no deception occurs.
6. Assure reflexivity.



Wall 4: Report Writing and Dissemination

In reporting your research, here are some practical and ethical guideposts:
1. Ensure completeness of the research report.
2. Ensure compliance with standards of rigor and ethics.
3. Provide insights and recommendations about doing research in the new normal.
4. Make sure dissemination activity does not disadvantage some students, particularly those with no or
limited internet connectivity.
5. Have clear authorship policies.
6. Explore venues for dissemination but beware of predatory journals and conferences.

Moving forward

In surmounting the challenges of the new normal, researchers can be motivated by welcoming the DAWN of a
NEW ERA in communication and media research:

D.A.W.N.
- Development of a positive mindset
- Assessment of new research ecology based on two major concerns: Safety and Ethics
- Willingness to balance compassion with academic rigor
- Negotiating the new research ecology to achieve feasibility without sacrificing quality

N.E.W.
- New look at established research practices
- Expansion and recalibration of research practices in response to the new normal
- Writing down these responses

E.R.A.
- Emphasis on beneficence and non-maleficence
- Respect for intellectual property
- Affirmation of privacy and confidentiality
