
Research Principles

ABR Quantitative
What is this course about
Gathering consumer information for informed decision making
• Scientific process for gathering information;
• The research process;
• Framing an appropriate research question based on the problem at hand;
• Comprehension of methodological implications;
• Looking for appropriate measures or available scales; and finally
• Applying appropriate statistical techniques to conclude as clearly as possible.
Use of SPSS and Smart-PLS software will be taught as tools for analyzing data.
Today’s Learning Objectives
• Discuss the nature of quantitative research.
• Identify various forms of research questions.
• Understand theory.
Prescribed Text
• Bryman, Alan, (2012). “Social Research Methods”, 4th Edition
• Chapter 7: The Nature of Quantitative Research
• Neuman, W. Lawrence (2011). “Social Research Methods: Qualitative
and Quantitative Approaches”, 7th Edition
• Chapter 7: Qualitative and Quantitative Measurement
Project Planning (Gantt Chart)
Research Project Plan (14 weeks)
• Topic Finalization
• Questionnaire
• Literature Review
• Methodology and Sampling
• Descriptive Analysis
• Inferential Analysis
• Final Draft
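The chart's bars were lost in conversion, but the same plan can be sketched as a text Gantt chart. The week ranges below are hypothetical placeholders, not the course's actual schedule:

```python
# Sketch of a 14-week research project plan as a text Gantt chart.
# Week ranges are hypothetical placeholders, not the actual course schedule.
PLAN = [
    ("Topic Finalization",       1, 2),
    ("Literature Review",        2, 5),
    ("Questionnaire",            4, 6),
    ("Methodology and Sampling", 5, 7),
    ("Descriptive Analysis",     8, 10),
    ("Inferential Analysis",     10, 12),
    ("Final Draft",              13, 14),
]

def gantt(plan, weeks=14):
    """Render each task as a row of '#' bars over the given number of weeks."""
    rows = []
    for task, start, end in plan:
        bar = "".join("#" if start <= w <= end else "." for w in range(1, weeks + 1))
        rows.append(f"{task:<26} {bar}")
    return "\n".join(rows)

print(gantt(PLAN))
```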
Redefining Marketing Research (1 of 2)
The American Marketing Association (AMA) redefined
Marketing Research as:

• Marketing research is the function that links the consumer, customer, and public to
the marketer through information — information used to identify and define
marketing opportunities and problems; generate, refine, and evaluate marketing
actions; monitor marketing performance; and improve understanding of marketing as
a process.
Redefining Marketing Research (2 of 2)
Information is used to:
• Identify and define market opportunities and problems
• Generate, refine, and evaluate marketing actions
• Monitor marketing performance
• Improve understanding of marketing as a process
Definition of Marketing Research
Marketing research is the systematic and objective
– identification
– collection
– analysis
– dissemination, and
– use of information
For the purpose of improving decision making related to the
– identification and
– solution of problems and opportunities in marketing
Definition of Marketing Research
Marketing research is the systematic
• Systematic planning is required at all stages of the marketing research process.
• The procedures followed at each stage are methodologically sound, well
documented, and, as much as possible, planned in advance.
• Scientific method is used: data are collected and analyzed to test prior
notions or hypotheses.
Definition of Marketing Research
Marketing research is the objective
• Marketing research attempts to provide accurate information that reflects a
true state of affairs.
• It is objective and should be conducted impartially.
• It should be free from the personal or political biases of the researcher or the
management.
Definition of Marketing Research
• Marketing research involves the identification, collection, analysis,
dissemination, and use of information.
• We identify or define the marketing research problem or opportunity and then
determine what information is needed to investigate it. The terms “problem”
and “opportunity” are used interchangeably here.
• Next, relevant information sources are identified and a range of data collection methods are evaluated for their usefulness.
• Data is collected using the most appropriate method; it is analyzed and
interpreted, and inferences are drawn.
• Finally, findings, implications, and recommendations are provided in a format
that allows information to be used for marketing decision making and to be
acted upon directly.
Classification of Marketing Research
Problem-Identification Research
• Research undertaken to help identify problems which are not
necessarily apparent on the surface and yet exist or are likely to
arise in the future. Examples: market potential, market share,
image, market characteristics, sales analysis, forecasting, and
trends research.

Problem-Solving Research
• Research undertaken to help solve
specific marketing problems. Examples:
segmentation, product, pricing,
promotion, and distribution research.
Marketing Research Process
Step 1 : Problem Definition
Step 2 : Development of an Approach to the Problem
Step 3 : Research Design Formulation
Step 4 : Fieldwork or Data Collection
Step 5 : Data Preparation and Analysis
Step 6 : Report Preparation and Presentation
Quantitative Research
• Measurement of social variables

• Common research designs: surveys and experiments

• Numerical and statistical data

• Deductive theory testing

• Objectivist view of reality as external to social actors (ontology)

• Positivist epistemology
Objectivist Ontology
• Social phenomena confront us as external facts

• Individuals are born into a pre-existing social world

• Social forces and rules exert pressure on actors to conform

• e.g. culture exists independently of the social actors who are socialized into its values
Positivist Epistemology
• Application of natural science methods to social science research

• Phenomenalism: knowledge via the senses

• Deductivism: theory testing

• Objective, value-free researcher

• Distinction between scientific and normative statements


Quantitative Research Methodology
In sum, quantitative research methodology:
• Entails the collection of numerical data,
• Exhibits a deductive relationship between theory and research and a
predilection for a natural science approach (and of positivism in
particular),
• and has an objectivist conception of social reality.
Deductive and Inductive Theory
• Deductivism:
• theory --> data
• explicit hypothesis to be confirmed or rejected
• quantitative research

• Inductivism:
• data --> theory
• generalizable inferences from observations
• qualitative research /grounded theory
The Deductive Approach
• Observe a phenomenon
• Consult relevant theories about the phenomenon
• Generate explanations (hypotheses) for those observations
• Make predictions
• Gather data to support those predictions. To do this, you need to design a
system of measuring those observations
• Analyze data
• The analysis supports or rejects your hypotheses
• If rejected, then you may need to revise your theory
• Research questions often stem from observations – these observations can be made in
passing, or they might come as a result of previous research. For example, your
exploratory research tells you that working professionals tend to shorten, or altogether
skip, breakfast. You may wonder if this is due to the time required to prepare breakfast.
This is your research question.
• A hypothesis is, in simple terms, the answer to your research question. From your
question, you would derive a hypothesis based on your background research. You might
hypothesize, “Working professionals shorten or skip breakfast because they do not
have time to prepare it.”
• A prediction, on the other hand, is specific to the study and experiment that you design
to test your hypothesis. It is the outcome you would observe if your hypothesis were
supported.
• Predictions are often written in the form of "if, then" statements, as in, "If my hypothesis is true, then this is what I will observe." For example, you may experiment with ready-prepared breakfasts and cooking from scratch for different groups of people.
• A refined prediction statement would then be: "If working professionals choose to have breakfast based on the time available to them, then when time is restricted they will have a ready-prepared breakfast, and when there is an abundance of time they will choose to cook from scratch."
Brief Description of Research Process
Management Research Question Hierarchy
(Figure: the hierarchy is viewed from three perspectives: participant, scholar/researcher, and management/practitioner.)
The management dilemma is usually a symptom of an
actual problem, such as:
1. Rising costs.
2. The discovery of an expensive chemical compound that would increase the
usefulness of a drug.
3. Increasing tenant move-outs from an apartment complex.
4. Declining sales.
5. A larger number of product defects during the manufacture of an
automobile.
6. An increasing number of letters and phone complaints about post-purchase service.
1. The management dilemma can also be triggered by an early
signal of an opportunity or growing evidence that a trend may
be gaining staying power.
• Identifying management dilemmas is rarely difficult.
• Choosing one dilemma on which to focus may be difficult.
• Choosing incorrectly may result in a waste of time and
resources.
2. Management question: the management dilemma restated in
question format.
3. Research question(s): Management questions translated into
researchable form; the hypothesis that best states the objective of
the research; the question(s) that focuses (focus) the researcher’s
attention.
4. Investigative questions: questions the researcher must ask to
satisfactorily answer the research question(s); what the decision
maker feels he/she needs to know to arrive at a conclusion about
the management dilemma.
5. Measurement questions: the questions asked of the
participants or the observations that must be recorded.
The definition of the management question sets the research task.
Management Dilemma and Management Research
Questions
• Management Dilemma:
• Written as a series of statements, not questions
• Captures the essence of the pressing situation faced by a group of practitioners
• Often aligns with, or complements, the formal "Problem or Opportunity Statement"

• Management Research Questions:
• Written as questions (ending in a "?") that reflect questions practitioners and scholars are asking
• Typically do not include terms and language reflecting theory or abstract constructs
Management Dilemma and Management Questions -
Examples
• Management Dilemma:
“Our organization is having difficulty attracting and retaining qualified employees,
and this threatens our competitive advantage and increases costs by having to
continually recruit and train new employees”

• Management Questions:
• Why are people leaving our organization?
• What can we do to keep our best performers here?
• What can we do to attract more qualified personnel?
• How can we reduce costs associated with turnover?
Management Research Questions
• Written from the perspective of a scholar/researcher examining the phenomenon through a theoretical framework
• May use technical terms that reflect theoretical constructs and variables
• Good research questions imply something about the methodological approach – quantitative or qualitative – either the specific research design or the analysis
Management Research Questions - Examples
• Qualitative Research Questions:
• What dimensions of an organization’s culture contribute to
employee retention and turnover? (possible ethnographic study)
• What factors do employees describe about their workplace
experience that contribute to their intentions to either remain with
a company or seek employment elsewhere? (possible
phenomenological design)
• Quantitative Research Questions:
• What is the relationship between employee satisfaction and
motivation (both as measured by valid and reliable instruments)?
(Correlation)
• Are there differences between men and women in their level of
satisfaction with workplace climate? (t-test)
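The two quantitative questions above name their analyses: a correlation and a t-test. A minimal stdlib-only sketch of both statistics, using made-up scores (the data, group labels, and sample sizes are illustrative assumptions, not findings):

```python
import math

# Made-up satisfaction and motivation scores for the same eight employees.
satisfaction = [3.2, 4.1, 2.8, 4.5, 3.9, 3.0, 4.2, 3.7]
motivation   = [3.0, 4.3, 2.5, 4.4, 4.0, 3.1, 4.1, 3.5]

def pearson_r(x, y):
    """Pearson correlation: strength of the linear relationship between two measures."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def welch_t(x, y):
    """Welch's t statistic: difference between two group means (e.g. men vs. women)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((a - mx) ** 2 for a in x) / (nx - 1)
    vy = sum((b - my) ** 2 for b in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

print(round(pearson_r(satisfaction, motivation), 3))

men   = [4.1, 3.8, 3.5, 4.0, 3.9]  # illustrative climate-satisfaction ratings
women = [3.6, 3.9, 3.4, 3.7, 3.5]
print(round(welch_t(men, women), 3))
```

In practice the same tests would be run in SPSS; the hand computation just shows what the questions commit you to measuring.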
Investigative/Measurement Level Questions
• Written as survey or interview questions as they would be presented to
participants.

Note: might also be questions that guide one's review of existing data sources or archival documents.
• Written at a reading level and using terms appropriate for the population being studied.
• Typically do NOT include complex terms or technical language used by the scholar, researcher, or theoretician.
Investigative/Measurement Level Questions
Qualitative Investigative/Measurement Questions:
• IQ: What are the indicators of employee satisfaction?
• MQ: What do you find most satisfying about your work in this company?
• What do you find frustrating about your work in this company?
• Why did you choose to leave the company?
Quantitative Investigative/Measurement Questions:
• IQ: How do employees perceive management's attitude?
• MQ: Supervisors treat employees with respect
1 2 3 4 5 6 7
Strongly Disagree Strongly Agree
• Sufficient time is provided to accomplish my work objectives
1 2 3 4 5 6 7
Investigative and Measurement Questions
• Investigative Question:
a. What is consumer perception about Lux liquid soap vis-à-vis its competitors?
b. Where does Lux liquid soap consumer shop?
• Measurement Question:
a. Rate Lux liquid soap’s overall performance on a scale from 1 to 10 with 1 = very
poor performance and 10 = excellent performance
1 2 3 4 5 6 7 8 9 10
(Same question repeated for competitors such as Dove, Safeguard etc)
b. From which retail outlet do you usually buy your liquid soap?
__________________________________
Detailed Description of Research Process

Full Research Circle


Concepts: Conceptualization
Theory
• A supposition or a system of ideas intended to explain something,
especially one based on general principles independent of the thing to be
explained.
• Theory is about connections among phenomena, a story about why acts,
events, structure, and thoughts occur.
• Theory emphasizes the nature of causal relationships, identifying what
comes first, as well as the timings of such events.
• Theory is usually laced with a set of convincing and logically interconnected
arguments.
• A good theory explains, predicts, and delights.

Source: Sutton & Staw, (1995). What Theory is not. Administrative Science Quarterly, 40(3), 371-384 (p. 378).
Parts of a Theory
• Assumptions

• Concepts

• Relationships

• Unit of Analysis
Parts of a Theory: 1) Assumptions
• Assumptions: An untested starting point or belief in a theory that is
necessary in order to build a theoretical explanation.
• TPB assumes that individuals act rationally, according to their
attitudes, subjective norms, and perceived behavioral control. These
factors are not necessarily actively or consciously considered during
decision-making, but form the backdrop for the decision-making
process.

• TAM assumes that its belief constructs - perceived ease of use (PEOU)
and perceived usefulness (PU) - fully mediate the influence of external
variables on IT usage behavior.
Parts of a Theory 2) Concepts
• Theoretical concept: An idea that is thought through, carefully defined,
and made explicit in a theory.
• Symbol
• Definition
• (Jargon)
• Concepts exist outside of social science theory.
• They are everywhere, and we use them all the time.

• Level of abstraction: A characteristic of a concept that ranges from empirical and concrete (Lion; Smile), often easily observable in daily experience, to very abstract (Feline/animal/living being; Happiness, satisfaction, contentment/Emotion), unseen mental creations.
Parts of a Theory 2) Concepts
Concept examples (figure: ladder of abstraction):
• Abstract level – concepts: structure, agency, social class, job search method, emotional satisfaction, religious orthodoxy, religious orientation, preservation of self, informal social control, negotiated order, culture, academic achievement, teacher expectations, charismatic leadership, healthy lifestyle, conversion
• Empirical level – observation of objects and events (reality)
What is a Concept?
• Concepts are:
• Building blocks of theory
• Labels that we give to elements of the social world
• Categories for the organization of ideas and observations (Bulmer, 1984)

• Concepts are useful for:
• Providing an explanation of a certain aspect of the social world
• Standing for things we want to explain
• Giving a basis for measuring variation
Parts of a Theory: 2) Concepts
• If a concept is to be employed in quantitative research, it will have to be measured.
• Once they are measured, concepts can take the form of independent or dependent variables.
• In other words, concepts may provide an explanation of a certain aspect of the social world, or they may stand for things we want to explain.
Why Measure Concepts?
• To delineate fine differences between people, organizations, or any
other unit of analysis in terms of the characteristic in question (e.g.
different age groups; different levels of preference or satisfaction).

• To provide a consistent device for gauging such distinctions: the measure should generate consistent results (reliability). This consistency relates to two things:
• our ability to be consistent over time and
• our ability to be consistent with other researchers.
• To produce a basis for more precise estimates of the degree of the relationship between concepts.
Indicators of Concepts
• Produced by the operational definition of a concept.
• An indicator is something that is devised or already exists and that is employed as though it were a measure of a concept.
• Used for concepts that are less directly quantifiable than measures (measures: quantities that can be unambiguously counted, e.g. income, age).
• Multiple-indicator measures
• a concept may have different dimensions
Indicators of Concepts
• Direct and indirect indicators of concepts.
• An indicator of marital status has a much
more direct relationship to its concept
than an indicator (or set of indicators)
relating to job satisfaction.
• Sets of attitudes, or behavior, always need
to be measured by batteries of indirect
indicators.
• When indicators are used that are not true
quantities, they will need to be coded for
quantification.
• Amount earned per month: a direct
measure of personal income. As an
indicator of social class, it becomes an
indirect measure.
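The coding step mentioned above, turning non-quantitative indicator responses into numbers, can be sketched as a simple label-to-number mapping; the response categories and numeric codes here are illustrative assumptions:

```python
# Coding a non-quantitative indicator for analysis.
# The response labels and numeric codes are illustrative assumptions.
AGREEMENT_CODES = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

responses = ["Agree", "Neutral", "Strongly agree", "Disagree"]
coded = [AGREEMENT_CODES[r] for r in responses]
print(coded)  # [4, 3, 5, 2]
```

The same coding scheme must be applied identically across all respondents, which is what makes the resulting numbers comparable.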
Indicators of Concepts
• The issue of indirectness raises the question of where an indirect
measure comes from.
• That is, how does a researcher devise an indicator of something like job
satisfaction?
• Usually, it is based on common-sense understandings of the forms the
concept takes or on anecdotal or qualitative evidence relating to that
concept.
• through a question (or series of questions) that is part of a structured interview
schedule or self-completion questionnaire;
• through the recording of individuals’ behaviour using a structured observation
schedule;
• through official statistics;
• through an examination of mass media content through content analysis;
• etc.
Why use more than one indicator?

• Single indicators may incorrectly classify many individuals

• Single indicators may capture only a portion of the underlying concept or be too general

• Multiple indicators can make finer distinctions between individuals

• Multiple indicators can capture different dimensions of a concept
Example of Multiple-Indicator Concept: Measure of
Commitment to Employment
To find out whether respondents’ commitment to work varied according to whether they were still unemployed
at the time of the interview or had found work or had retired. Ten statements asked respondents to indicate
their level of agreement or disagreement on a seven-point scale running from ‘Yes, I strongly agree’ to ‘No, I
strongly disagree’. There was a middle point on the scale that allowed for a neutral response. This approach to
investigating a cluster of attitudes is known as a Likert scale, though in many cases researchers use a five-point
rather than a seven-point scale for responses.

1. Work is necessary, but rarely enjoyable.
2. Having a job is not very important to me.
3. I regard time spent at work as time taken away from the things I want to do.
4. Having a job is/was important to me only because it brings in money.
5. Even if I won a great deal of money on the pools I’d carry on working.
6. If unemployment benefit were really high, I would still prefer to work.
7. I would hate being on the dole.
8. I would soon get bored if I did not go out to work.
9. The most important things that have happened to me involved work.
10. Any feelings I’ve had in the past of achieving something worthwhile have usually come through things I’ve done at work.
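One way to turn the ten items above into a single commitment score is to reverse-score the negatively worded statements on the 7-point scale (8 minus the response). Treating items 1-4 as the negatively worded ones is my reading of their wording, not something stated in the source:

```python
# Scoring the ten-item commitment-to-employment scale (7-point responses).
# Treating items 1-4 as negatively worded (reverse-scored) is an assumption
# based on reading the statements, not stated in the source.
REVERSED = {1, 2, 3, 4}

def commitment_score(responses):
    """responses: dict of item number (1-10) -> rating (1-7). Returns the summed score."""
    total = 0
    for item, rating in responses.items():
        total += (8 - rating) if item in REVERSED else rating
    return total

# A highly committed respondent: agrees with positive items, disagrees with negative ones.
answers = {i: 6 for i in range(5, 11)}
answers.update({i: 2 for i in range(1, 5)})
print(commitment_score(answers))  # 6*6 + 4*(8-2) = 60
```

Without reverse-scoring, agreement on items 1-4 would pull the total in the wrong direction, which is exactly the misclassification risk that motivates careful multiple-indicator design.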
Dimensions of Concepts

• Different aspects or components of a concept


• For example, 3 dimensions commonly used in social sciences research:
• Cognitive: logical reasoning about concept
• Emotive: emotional involvement with concept
• Conative: behavioral outcome
• Brand loyalty:
• Cognitive: belief that the brand performs better than its competitors
• Emotive: emotional attachment to the brand
• Conative: will continue to buy even with certain price increases
Parts of a Theory: 3) Relationships
• Theories tell us whether the concepts are connected to one another,
and, if so, how.
• Kinds of relationships

• Positive
• Negative

• Interaction effect: one concept accelerates or decelerates/diminishes the effect of the other, or makes its impact immediate or delayed (moderation).

• Contingent relationship: one concept relates to another, but only under certain conditions (full/partial mediation).
Parts of a Theory: 3) Relationships

• Proposition: a theoretical statement about the relationship between two or more concepts. It is a belief that may or may not have been tested.

• Hypothesis: an empirically testable version of a theoretical proposition that has not yet been tested or verified with empirical evidence. It is most used in deductive theorizing and can be restated as a prediction.
Diagrams of Causal Relationships
(Figures: positive relationship; positive and negative relationship; positive path relationship; complex relationships.)
Parts of a Theory: 4) Units of Analysis

• Individual
• Group
• Organization
• Region
• Country
Conceptualization and Operationalization
Example of the deductive measurement process for the hypothesis: "A professional work environment increases the level of teacher morale."
What does Reliability mean?

Reliability: the quality of being trustworthy or of performing consistently well.

• Stability
• Is the measure stable over time?
• e.g. test–retest method: same sample, same test, different times (T1 and T2); there should be a high correlation between the two sets of observations (Obs1 and Obs2).

• Internal reliability (multiple-indicator concept)
• Are the indicators that make up the scale or index consistent (coherent)?
• e.g. split-half method
• Cronbach's alpha: all indicators of a concept should relate to each other, and not to the indicators of another concept.

• Inter-observer consistency
• Is the measure consistent between observers? Important when more than one observer is involved in judgement: do all observers agree on the same measurement?
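The internal-reliability check above can be sketched directly from the Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of totals); the item scores below are made up for illustration:

```python
# Cronbach's alpha on made-up item scores: each inner list is one item's
# scores across the same six respondents.
def variance(xs):
    """Sample variance (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of per-item score lists, same respondents in each list."""
    k = len(items)
    item_vars = sum(variance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

item_scores = [
    [4, 5, 3, 4, 5, 2],   # item 1, six respondents
    [4, 4, 3, 5, 5, 2],   # item 2
    [5, 5, 2, 4, 4, 3],   # item 3
]
print(round(cronbach_alpha(item_scores), 3))
```

Test-retest stability would simply be the correlation between each respondent's score at T1 and T2; alpha instead asks whether the items hang together at a single time point.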
What does (Measurement) Validity mean?
Does the indicator measure the concept?
It does if it has:
• Face validity (does it look right for the concept? The bare minimum)
• Concurrent validity (is it supported by a relevant criterion today?)
• Predictive validity (is it likely to be supported by a relevant criterion tomorrow?)
• Construct validity (are useful hypotheses produced?)
• Convergent validity (is it supported by results from other methods?)
Reliability and Validity in PLS-SEM
• Loadings – indicator (item) reliability
• Cronbach's Alpha – internal consistency reliability
• Composite reliability – internal consistency reliability
• Rho_A (ρA) – internal consistency reliability
• AVE – convergent validity
• HTMT – discriminant validity
Causality
• Explanation
• why things are the way they are

• Direction of causal influence


• relationship between dependent & independent variables

• Confidence
• in the researcher's causal inferences
Generalization

• Can findings be generalized beyond the confines of the particular context?
• Can findings be generalized from sample to population?
• How representative are samples?
Replication

• Minimizing contamination from researcher biases or values

• Explicit description of procedures

• Control of conditions of study

• Ability to replicate in differing contexts


Replicating a Study of Tipping

• Brewster and Lynn (2014) explored customer racial discrimination in tipping
• Replicated a previous study by Lynn, which found black restaurant workers were tipped less
• Limitation of the previous study: it lacked a measure of service quality
• New study: service quality was not a mediating variable, and the race of the diner was not significant in tipping behaviour
• The findings were similar
Criticisms of Quantitative Research
• Failure to distinguish between objects in the natural world and social
phenomena
• Artificial and spurious sense of precision and accuracy
• Lack of ecological validity
• reliance on instruments and measurements
• Static view of social life
Is it always like this?
• Quantitative research design is an ideal-typical approach
• Useful as a guide of good practice
• But discrepancy between ideal type and actual practice of social
research
• Pragmatic concerns mean that researchers may not adhere rigidly to
these principles
One reason for the discrepancy between the ideal type and actual practice
• Quantitative research is usually deductive (operational definition of
concepts)
• But measurements can sometimes lead to inductive theorising
• And this means the factors give rise to the concepts, rather than
making them operational
• Bryman (1988:28) calls this ‘reverse operationism’
…and another reason
• Published accounts of quantitative research rarely report evidence of
reliability and validity (Podsakoff & Dalton, 1987)
• Researchers are primarily interested in the substantive content and
findings of their research
• Running tests of reliability and validity may seem an unappealing
alternative!
• But researchers remain committed to the principles of good practice
