
Lecture 4 Research Design

Research Design
• Blueprint of a study.
• It defines:
• The study type (descriptive, correlational, semi-experimental, experimental, review, meta-analytic, etc.) and sub-type (e.g., descriptive-longitudinal case study).
• The research question, hypotheses, independent and dependent variables, and experimental design.
• Data collection methods and a statistical analysis plan.
• Design Science Research?
Research Design
• “Research design is the plan, structure and strategy of investigation conceived so as to obtain answers to research questions and to control variance.” (Kerlinger)
• “A research design is the specification of methods and procedures for acquiring the information needed. It is the overall operational pattern or framework of the project that stipulates what information is to be collected, from which sources, and by what procedures.” (Green and Tull)
Need for Research Design
• Abstracts from the large number of complex decisions researchers have to make.

• Helps us prepare the logistical and administrative resources needed.

• Makes research more efficient.

• Impacts the reliability of the study.


A Research Design Consists of
• A statement of the objectives of the study or the research output.
• A statement of the data inputs required, on the basis of which the research problem is to be solved.
• The methods of analysis that will be used to treat and analyze the data inputs.
Issues to be addressed in Research Design

• What is the study about?


• Why is the study being made?
• Where will the study be carried out?
• What type of data is required?
• Where can the required data be found?
Issues to be addressed in Research Design
• What periods of time will the study include?
• What will be the sample design?
• What techniques of data collection will be used?
• How will the data be analyzed?
• In what style will the report be prepared?
Concepts applied in Research Design
• Variables (nominal, ordinal, interval, ratio); see the sketch after this list.
• Dependent and independent variables.
• Extraneous variables.
• Control.
• Moderating variables (reading assignment).
• Research hypothesis.
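As an illustration of the measurement levels (not from the slides; all variables and values are assumed examples), the sketch below tags a few hypothetical variables with their level and the summary statistic that is meaningful at that level.

```python
# Illustrative sketch (assumed example data): measurement levels and the
# summary statistics that are meaningful for each level.
from statistics import mean, median, mode

nominal = ["Linux", "Windows", "Linux", "macOS"]    # categories, no order
ordinal = [1, 3, 2, 3, 5, 4]                        # e.g. Likert ratings, ordered
interval = [20.5, 21.0, 19.8, 22.3]                 # e.g. temperature in Celsius, no true zero
ratio = [0.0, 12.5, 7.2, 30.1]                      # e.g. task time in seconds, true zero

print("nominal  -> mode:", mode(nominal))           # only the mode is meaningful
print("ordinal  -> median:", median(ordinal))       # median/percentiles are meaningful
print("interval -> mean:", mean(interval))          # differences and means are meaningful
print("ratio    -> mean:", mean(ratio))             # ratios ("twice as long") also meaningful
```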
Concepts applied in Research Design
• Experimental and non-experimental hypothesis-testing research
• Experimental and control groups
• Experiment
• Experimental unit(s)
• Unit of Analysis
Types of Research
• Quantitative
  • Experimental
  • Quasi-experimental
  • Correlational
  • Single subject
  • Descriptive
  • Meta-analysis
• Qualitative
  • Interpretive (understanding a situation from the participants’ perspective)
    • Phenomenological
    • Ethnographic
    • Grounded theory
  • Critical (understanding and critiquing power within society)
    • Action research
    • Dialectics
• Mixed Methods
  • QUAL-quant
  • QUANT-qual
  • QUAL-QUANT
Issues in Research Design
• Nature of sample
  • Random vs. stratified
  • Convenience sample
• Size of sample determines (see the power-analysis sketch after this list):
  • Generalizability of results
  • Ability to detect a true effect
• Validity
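To make the link between sample size and the ability to detect a true effect concrete, here is a rough power-analysis sketch using the standard normal approximation for a two-group comparison. The effect sizes, alpha, and power values are assumed examples, not figures from the lecture.

```python
# Rough sample-size sketch (normal approximation for a two-sample comparison).
# Effect size, alpha, and power values are assumed examples.
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per group to detect a standardized mean
    difference `effect_size` with a two-sided test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z.inv_cdf(power)            # quantile giving the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A smaller true effect needs a much larger sample to be detected reliably.
for d in (0.2, 0.5, 0.8):
    print(f"effect size {d}: about {n_per_group(d)} subjects per group")
```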
Reliability
• A reliable measure is consistent in producing the same results every time it is used.
• Observed score = true score + systematic error + random error
• Observed scores are the data gathered by the researcher.
• True scores are the actual unknown values that correspond to the construct of interest.
• Systematic error is variation that results from constructs that are not of interest.
• Random error is nonsystematic variation in the observed scores.
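A minimal simulation of the score model above (an assumed illustration, not part of the lecture): observed scores are generated as true score plus a constant bias plus random noise, and the test-retest correlation drops as the random-error component grows.

```python
# Minimal simulation of the score model (assumed illustration values):
#   observed score = true score + systematic error + random error
import random
import statistics

def administer(true_scores, systematic_error, random_sd, rng):
    """One administration of the measure: constant bias plus random noise."""
    return [t + systematic_error + rng.gauss(0, random_sd) for t in true_scores]

def pearson_r(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * statistics.pstdev(xs) * statistics.pstdev(ys))

rng = random.Random(42)
true_scores = [rng.gauss(50, 10) for _ in range(500)]   # unknown construct values

for random_sd in (2, 10):                               # small vs. large random error
    test = administer(true_scores, 5, random_sd, rng)
    retest = administer(true_scores, 5, random_sd, rng)
    print(f"random error sd={random_sd}: test-retest r = {pearson_r(test, retest):.2f}")
```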
Reliability
• Reading Assignment: “Difference between random error and
systematic error?”
• Test-retest
• Internal consistency
• Split-half
• Cronbach’s alpha: the average of all possible split-half reliabilities (see the computation sketch below)
[Figure: Observed score = true score + systematic error + random error. A more reliable measure has a smaller random-error component than a less reliable one.]
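As a sketch of the internal-consistency idea, the code below computes Cronbach’s alpha directly from its standard formula for a small made-up item-response matrix (rows are respondents, columns are items).

```python
# Cronbach's alpha from its standard formula:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# The item-response matrix below is a made-up example.
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of k item scores."""
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 2, 3, 3],
    [4, 4, 5, 4],
]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```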
Factors that increase reliability
• Number of items (see the Spearman-Brown sketch after this list)
• High variation among individuals being tested
• Clear instructions
• Optimal testing situation
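The effect of the first factor (number of items) can be sketched with the Spearman-Brown prophecy formula, which predicts the reliability of a test lengthened by a factor k; the starting reliability of 0.60 below is an assumed example value.

```python
# Spearman-Brown prophecy formula: reliability of a test lengthened k times.
# The starting reliability of 0.60 is an assumed example value.
def spearman_brown(reliability: float, k: float) -> float:
    return k * reliability / (1 + (k - 1) * reliability)

for k in (1, 2, 3, 4):
    print(f"{k}x as many items -> predicted reliability {spearman_brown(0.60, k):.2f}")
```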
Validity
• Validity is the extent to which a concept, conclusion or measurement
is well-founded and likely corresponds accurately to the real world

• It is an indication of how sound the research is.

• Applies to both the design and methods of the research.

• Are you really measuring what you are intending to measure?


Validity
• There are different types of validity.

• Construct validity, content validity, face validity, criterion validity, concurrent validity, etc.

• A reliable result is not always a valid result.


Types of Validity
• Construct validity:
  • The extent to which operationalizations of a construct (e.g., practical tests developed from a theory) measure the construct as defined by a theory.
  • The extent to which hypotheses about the construct are supported by data.
  • Define the construct and generate hypotheses about the construct’s relation to other constructs.
  • Develop a comprehensive measure of the construct and assess its reliability.
  • Examine the relationship of the measure to other, similar and dissimilar constructs (see the correlation sketch after this list).
  • E.g., height & weight; networking & career outcomes study.
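To illustrate the last step (examining the measure’s relationship to similar and dissimilar constructs), the sketch below correlates a hypothetical new measure with one conceptually similar and one unrelated measure; a high correlation with the former and a near-zero correlation with the latter would be convergent and discriminant evidence, respectively. All data are made up.

```python
# Convergent vs. discriminant evidence via simple correlations (made-up data).
import random
import statistics

def pearson_r(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * statistics.pstdev(xs) * statistics.pstdev(ys))

rng = random.Random(0)
new_measure = [rng.gauss(0, 1) for _ in range(200)]
similar_construct = [x + rng.gauss(0, 0.5) for x in new_measure]   # should correlate highly
unrelated_construct = [rng.gauss(0, 1) for _ in range(200)]        # should correlate near zero

print(f"convergent   r = {pearson_r(new_measure, similar_construct):.2f}")
print(f"discriminant r = {pearson_r(new_measure, unrelated_construct):.2f}")
```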
Construct validity
• E.g., the extent to which a test measures the usability of a new UI for an information system.

• Encompasses convergent validity and content validity.

• There should be empirical and theoretical support for the interpretation of the construct.
Content Validity
• The degree to which the content of the test matches a content domain associated with the construct.

• The extent to which a measure “covers” the construct of interest.

• Content validity is most often assessed by relying on the knowledge of people who are familiar with the construct being measured.

• E.g., measuring the error rate in a network.
Internal vs. External Validity
• Internal validity: the correctness of conclusions regarding the relationships among the variables examined.
  • E.g., whether the research findings accurately reflect how the research variables are really connected to each other.

• External validity: the generalizability of the findings to the intended/appropriate population or setting.
  • E.g., whether appropriate subjects were selected for conducting the study.
Research Methods
• Methods/techniques that are used for conducting research.

• Research methods and techniques are sometimes viewed differently: methods are more general and give rise to techniques.


Design Science Research Methodology
• Motivation
• Types of research
• Design Research Basics
• Evaluation in Design Research
• Conclusion
Motivation
• Motivation for research:
• pure research: enhance understanding of phenomena
• instrumentalist research: a problem needs a solution
• applied research: a solution needs application fields

• Motivation for research methodology


• (qualitatively) control research process
• validate research results
• compare research approaches
• respect rules of good scientific practice
Research: A Definition
• Research:
• an activity that contributes to the understanding of a
phenomenon [Kuhn, 1962; Lakatos, 1978]
• phenomenon: a set of behaviors of some entity(ies) that
is found interesting by a research community
• understanding: knowledge that allows prediction of the
behavior of some aspect of the phenomenon
• activities considered appropriate to the production of
understanding (knowledge) are the research methods
and techniques of a research community
• paradigmatic vs multi-paradigmatic communities
(agreement on phenomena of interest and research
methods)
Scientific Disciplines
• Types of research [Simon, 1996]:
• natural sciences: phenomena occurring in the world (nature or society)
• design sciences ~ sciences of the artificial:
• all or part of the phenomena may be created artificially
• studies artificial objects or phenomena designed to meet certain goals
• social sciences: structural level processes of a social system and its impact on social
processes and social organization
• behavioural sciences: the decision processes and communication strategies within
and between organisms in a social system
[Figure: the design sciences positioned between phenomena and activities, with the Semantic Web (CS) as an example discipline (adapted from Owen, 1997).]
Design research basics
• Process model
• Artifact types:
• result of the research work
• Artifact structure
• content of the research approach
• Evaluation:
• evaluation criteria
• evaluation approach
Process model
• a problem-solving paradigm:
• seeks to create innovations that define the ideas, practices, technical
capabilities, and products through which the analysis, design,
implementation, and use of information systems can be effectively and
efficiently accomplished [Tsichritzis 1997; Denning 1997]
Design research process

[Figure: design research process model (Takeda, 1990). Process steps: Awareness of problem → Suggestion → Development → Evaluation → Conclusion. Knowledge flows (circumscription; operation and goal knowledge) feed results back into earlier steps, and the logical formalism moves from abduction (suggestion) to deduction (development and evaluation).]
Artifacts
• are not exempt from natural laws or behavioral theories
• artifacts rely on existing "kernel theories" that are applied, tested,
modified, and extended through the experience, creativity, intuition,
and problem solving capabilities of the researcher [Walls et al. 1992;
Markus et al. 2002]
Design research outputs
[March & Smith, 1995]
• Constructs
  • the conceptual vocabulary of a problem/solution domain
• Methods
  • algorithms and practices to perform a specific task
• Models
  • sets of propositions or statements expressing relationships among constructs
  • abstractions and representations
• Instantiations
  • the realization of constructs, models and methods in a working system
  • implemented and prototype systems
• Better theories
  • artifact construction
(Margin label in original: “Thesis output”.)
Design research outputs
[Figure (Purao, 2002): three levels of design research outputs connected by abstraction: the artifact as a situated implementation (instantiations), knowledge as operational principles (constructs, models, methods), and better theories (emergent theory about the embedded phenomena).]
Examples
• Open up a new area
• Provide a unifying framework
• Resolve a long-standing question
• Thoroughly explore an area
• Contradict existing knowledge
• Experimentally validate a theory
• Produce an ambitious system
• Provide empirical data
• Derive superior algorithms
• Develop new methodology
• Develop a new tool
• Produce a negative result
Artifact structure
• Structure of the artifact
  • the information space the artifact spans
  • basis for deducing all required information about the artifact
  • determines the configurational characteristics necessary to enable the evaluation of the artifact
(Margin label in original: “Content of the thesis”.)
Evaluation criteria
• Evaluation criteria
  • the dimensions of the information space which are relevant for determining the utility of the artifact
(Margin label in original: “Test cases”.)
Evaluation approach
• Evaluation approach
  • the procedure for how to practically test an artifact
  • defines all roles concerned with the assessment and the way of handling the evaluation
  • the result is a decision whether or not the artifact meets the evaluation criteria, based on the available information
(Margin label in original: “Testing method”.)
Evaluation approach (2)
• Quantitative evaluation:
• originally developed in the natural sciences to study natural phenomena
• approaches:
• survey methods
• laboratory experiments
• formal methods (e.g. econometrics)
• numerical methods (e.g. mathematical modeling)
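As a concrete example of the laboratory-experiment approach, this sketch compares task completion times for two hypothetical UI variants with an independent-samples t-test. SciPy is assumed to be available, and all data are made up.

```python
# Laboratory-experiment style quantitative evaluation (made-up data):
# compare task completion times (seconds) for two hypothetical UI variants.
from scipy import stats

ui_a = [41.2, 38.5, 44.0, 39.7, 42.3, 40.1, 43.8, 37.9]
ui_b = [35.4, 33.8, 36.9, 32.7, 34.5, 36.1, 33.2, 35.0]

t, p = stats.ttest_ind(ui_a, ui_b)   # independent-samples t-test
print(f"t = {t:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Reject H0: the UI variants differ in mean completion time.")
else:
    print("No significant difference detected at alpha = 0.05.")
```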
Evaluation approach (3)
• Qualitative evaluation:
• developed in the social sciences to enable researchers to
study social and cultural phenomena
• approaches:
• action research
• case study research
• ethnography
• grounded theory
• qualitative data sources:
• observation and participant observation (fieldwork)
• interviews and questionnaires
• documents and texts
• the researcher’s impressions and reactions
Constructs
• Structure: meta-model of the vocabulary
• Evaluation criteria: construct deficit, construct overload, construct redundancy, construct excess
• Evaluation approach: ontological analysis
Methods
• Structure: process-based meta-model; intended applications; conditions of applicability; products and results of the method application; reference to constructs
• Evaluation criteria: appropriateness, completeness, consistency
• Evaluation approach: laboratory research, field inquiries, surveys, case studies, action research, practice descriptions, interpretative research
Models
• Structure: domain; scope, purpose; syntax and semantics; terminology; intended application
• Evaluation criteria: correctness, completeness, clarity, flexibility, simplicity, applicability, implementability
• Evaluation approach: syntactical validation, integrity checking, sampling using selective matching of data to actual external phenomena or a trusted surrogate, integration tests, risk and cost analysis, user surveys
Instantiations
• Structure: executable implementation in a programming language; reference to a design model; reference to a requirement specification; reference to the documentation; reference to quality management documents; reference to configuration management documents; reference to project management documents
• Evaluation criteria: functionality, usability, reliability, performance, supportability
• Evaluation approach: code inspection, testing, code analysis, verification
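For the “testing” entry in the evaluation approach above, a minimal sketch: a unit test for a hypothetical function of the implemented artifact. The function `shortest_route` and its behaviour are assumed purely for illustration.

```python
# Minimal sketch of "testing" as an evaluation approach for an instantiation.
# `shortest_route` is a hypothetical artifact function, assumed for illustration.
import unittest

def shortest_route(graph, start, goal):
    """Toy stand-in for an artifact function: breadth-first search route length."""
    frontier, seen = [(start, 0)], {start}
    while frontier:
        node, dist = frontier.pop(0)
        if node == goal:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None

class TestShortestRoute(unittest.TestCase):
    def test_direct_link(self):
        self.assertEqual(shortest_route({"A": ["B"]}, "A", "B"), 1)

    def test_no_route(self):
        self.assertIsNone(shortest_route({"A": ["B"]}, "B", "A"))

if __name__ == "__main__":
    unittest.main()
```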
Conclusion

Good research results require a careful design of the research methodology and considerable evaluation efforts.
