Lecture 4 Research Design 2
Research Design
• Blueprint of a study
• It defines:
• The study type (descriptive, correlational, semi-experimental,
experimental, review, meta-analytic, …) and sub-type (e.g.,
descriptive-longitudinal case study).
• Research question, hypotheses, independent and dependent
variables, experimental design.
• Data collection methods and a statistical analysis plan.
• Design Science Research?
Research Design (2)
• “Research design is the plan, structure and strategy of investigation
conceived so as to obtain answers to research questions and to
control variance.” (Kerlinger)
• “A research design is the specification of methods and procedures for
acquiring the information needed. It is the overall operational pattern
or framework of the project that stipulates what information is to be
collected, from which sources, by what procedures.” (Green and Tull)
Need for Research Design
• It abstracts from the large number of complex decisions
researchers have to make.
[Diagram, reconstructed as a list]
• Quantitative: correlational, single-subject, descriptive, meta-analysis
• Qualitative:
• Interpretive (understanding a situation from the participant
perspective): phenomenological, ethnographic, grounded theory
• Critical (understanding and critiquing power within society):
action research, dialectics
• Mixed methods: QUAL-quant, QUANT-qual, QUAL-QUANT
Issues in Research Design
• Nature of sample
• Random vs. stratified
• Convenience sample
• Size of sample determines:
• Generalizability of results
• Ability to detect a true effect
• Validity
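The link between sample size and the ability to detect a true effect can be made concrete with a standard large-sample approximation. The helper below is an illustrative sketch (not part of the lecture): it estimates the per-group n needed to compare two group means at a given effect size (Cohen's d), significance level, and power, using n ≈ 2·((z₁₋α/₂ + z₁₋β) / d)².

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-group mean
    comparison, via the normal-approximation formula
    n ~ 2 * ((z_{1-alpha/2} + z_{1-beta}) / d) ** 2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for power = 0.80
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# A "medium" effect (d = 0.5) needs roughly 63 participants per group;
# halving the effect size roughly quadruples the required sample.
print(n_per_group(0.5))   # 63
print(n_per_group(0.25))  # 252
```

This is why small convenience samples often cannot detect anything but very large effects.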
Reliability
• Consistent in producing the same results every time the measure is used.
• Observed Score = true score + systematic error +
random error
• Observed Scores are the data gathered by the researcher
• True Scores are the actual unknown values that correspond to the
construct of interest
• Systematic Error is variation in the observed scores that results from
constructs other than the construct of interest
• Random Error is nonsystematic variation in the observed scores
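The decomposition above can be illustrated with a small simulation (a sketch with made-up numbers, not from the lecture): averaging many repeated measurements shrinks random error toward zero, while systematic error survives as a constant bias.

```python
import random
from statistics import mean

random.seed(42)

TRUE_SCORE = 50.0       # unknown value of the construct (assumed here)
SYSTEMATIC_ERROR = 2.0  # constant bias, e.g. a miscalibrated instrument
RANDOM_SD = 3.0         # spread of the nonsystematic noise

# Observed Score = True Score + Systematic Error + Random Error
observed = [TRUE_SCORE + SYSTEMATIC_ERROR + random.gauss(0, RANDOM_SD)
            for _ in range(10_000)]

# The mean converges to TRUE_SCORE + SYSTEMATIC_ERROR (about 52),
# not to 50: random error averages out, systematic error does not.
print(round(mean(observed), 2))
```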
Reliability (2)
• Reading Assignment: “Difference between random error and
systematic error?”
• Test-retest
• Internal Consistency
• Split-half
• Cronbach’s alpha: average of all possible split-half reliabilities
Observed Score = True Score + Systematic Error + Random Error
[Figure: examples of more reliable vs. less reliable measurements]
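Cronbach's alpha can be computed directly from the item variances and the variance of the total score. The function below is a minimal sketch of the standard formula α = k/(k−1) · (1 − Σσᵢ² / σ²_total), applied to made-up item data:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item,
    each of length n_respondents. Uses sample variances."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[r] for item in items) for r in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Three perfectly consistent items -> alpha = 1.0
perfect = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
# One item only loosely tracks the others -> alpha drops
noisy = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [2, 1, 4, 3, 5]]

print(round(cronbach_alpha(perfect), 3))  # 1.0
print(round(cronbach_alpha(noisy), 3))    # 0.951
```

Adding more items that track the same construct pushes alpha up, which is one reason the number of items appears below among the factors that increase reliability.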
Factors that increase reliability
• Number of items
• High variation among individuals being tested
• Clear instructions
• Optimal testing situation
Validity
• Validity is the extent to which a concept, conclusion, or measurement
is well-founded and likely corresponds accurately to the real world.
[Figure: research activities in Computer Science / the Semantic Web, after Owen (1997)]
Design research basics
• Process model
• Artifact types:
• result of the research work
• Artifact structure
• content of the research approach
• Evaluation:
• evaluation criteria
• evaluation approach
Process model
• a problem-solving paradigm:
• seeks to create innovations that define the ideas, practices, technical
capabilities, and products through which the analysis, design,
implementation, and use of information systems can be effectively and
efficiently accomplished [Tsichritzis 1997; Denning 1997]
Design research process
[Takeda, 1990; diagram, reconstructed]
• Process steps: Awareness of Problem → Suggestion → Development →
Evaluation → Conclusion
• Knowledge flows: circumscription; operation and goal knowledge
• Logical formalism: abduction (awareness of problem, suggestion);
deduction (development, evaluation)
Artifacts
• are not exempt from natural laws or behavioral theories
• artifacts rely on existing "kernel theories" that are applied, tested,
modified, and extended through the experience, creativity, intuition,
and problem solving capabilities of the researcher [Walls et al. 1992;
Markus et al. 2002]
Design research outputs
[March & Smith, 1995]
• Constructs
• conceptual vocabulary of a problem/solution domain
• Methods
• algorithms and practices to perform a specific task
• Models
• a set of propositions or statements expressing relationships among
constructs
• abstractions and representations
• Instantiations
• constitute the realization of constructs, models and methods in a
working system
• implemented and prototype systems
• Better theories
• artifact construction
Design research outputs (2)
[Diagram, reconstructed: constructs, models, and methods are forms of
abstraction (knowledge as operational principles); better theories capture
emergent theory about embedded phenomena]
Evaluation approach
• Evaluation approach
• the procedure for practically testing an artifact
• defines all roles concerned with the assessment and the
way of handling the evaluation
• the result is a decision whether or not the artifact meets the
evaluation criteria, based on the available information.
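As a toy illustration (the criteria names and thresholds are invented, not from the lecture), the decision step can be sketched as a check of measured values against per-criterion thresholds:

```python
def meets_criteria(thresholds, measurements):
    """Return True iff every evaluation criterion is met.
    thresholds:   {criterion: minimum acceptable value}
    measurements: {criterion: value observed for the artifact}
    A criterion without a measurement counts as unmet (insufficient
    information), mirroring 'based on the available information'."""
    return all(
        criterion in measurements and measurements[criterion] >= minimum
        for criterion, minimum in thresholds.items()
    )

# Hypothetical criteria for a prototype search component
thresholds = {"precision": 0.80, "recall": 0.70, "user_rating": 4.0}

print(meets_criteria(
    thresholds,
    {"precision": 0.85, "recall": 0.74, "user_rating": 4.2}))  # True
print(meets_criteria(
    thresholds,
    {"precision": 0.85, "recall": 0.74}))  # False: rating not measured
```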
Evaluation approach (2)
• Quantitative evaluation:
• originally developed in the natural sciences to study natural phenomena
• approaches:
• survey methods
• laboratory experiments
• formal methods (e.g. econometrics)
• numerical methods (e.g. mathematical modeling)
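A laboratory experiment typically ends in a statistical comparison of conditions. As a sketch (with invented data, not from the lecture), Welch's t-statistic for two independent groups can be computed from scratch, using |t| ≳ 1.96 as a rough large-sample cut-off instead of an exact p-value:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples
    (unequal variances allowed)."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Invented task-completion scores: treatment vs. control condition
treatment = [12.1, 11.8, 12.5, 12.0, 12.3, 11.9, 12.2, 12.4]
control   = [11.2, 11.5, 11.0, 11.3, 11.6, 11.1, 11.4, 11.2]

t = welch_t(treatment, control)
print(round(t, 2))  # 7.67, far above the rough 1.96 cut-off
```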
Evaluation approach (3)
• Qualitative evaluation:
• developed in the social sciences to enable researchers to
study social and cultural phenomena
• approaches:
• action research
• case study research
• ethnography
• grounded theory
• qualitative data sources:
• observation and participant observation (fieldwork)
• interviews and questionnaires
• documents and texts
• the researcher’s impressions and reactions