
CH1: INTRODUCTION

"The scientist has no other method than doing his damnedest." (P. W. Bridgman, Reflections of a Physicist)
All progress is born of enquiry. Doubt is often better than overconfidence, for it leads to inquiry, and inquiry leads to invention.

Plan of presentation
1. Meaning of research.
2. Objectives of research.
3. Motivation in research.
4. Types of research.
5. Research approaches.
6. Significance of research.
7. Research methods vs. research methodology.
8. Research and scientific method.
9. Importance of knowing how research is done.
10. Research process.
11. Criteria for good research.
12. Problems encountered by researchers in India.

Meaning of research.
1. A careful investigation or inquiry, especially through search for new facts in any branch of knowledge.
2. Redman and Mory define research as a systematized effort to gain new knowledge.
3. Research is an original contribution to the existing stock of knowledge.
4. Inquisitiveness is the mother of knowledge.
5. Pursuit of truth with the help of study, observation, comparison and experiment.
6. A systematic approach concerning generalization and the formulation of theory is also research.

Objectives of research.
1. To gain familiarity with a phenomenon or to achieve new insights into it (exploratory or formulative research).
2. To portray accurately the characteristics of a particular individual, situation or group (descriptive research).
3. To determine the frequency with which something occurs or with which it is associated with something else (diagnostic research).
4. To test a hypothesis of a causal relationship between variables (hypothesis-testing research).

Motivation in research
1. Desire to get a research degree along with its consequential benefits.
2. Desire to face the challenge of solving unsolved problems; concern over practical problems initiates research.
3. Desire to get the intellectual joy of doing some creative work.
4. Desire to be of service to society.
5. Desire to get respectability.

Types of research.
1. Descriptive (ex post facto research) vs. analytical.
2. Applied vs. fundamental (basic).
3. Quantitative vs. qualitative.
4. Conceptual vs. empirical.
5. Other types: field/lab research, clinical/diagnostic research, exploratory vs. conclusive research, historical research.

Research approaches.
Quantitative: experimental, simulation, inferential.
Qualitative: focus group interviews, projective techniques, depth interviews, ethnographic, phenomenological and field research.

Significance of research.
Research inculcates scientific and inductive thinking and promotes the development of logical habits of thinking and organization.
The role of research in several fields of applied economics, whether related to business or to the economy as a whole, has greatly increased in modern times.
Research provides the basis for nearly all government policies in our economic system.

Research methods Vs Methodology.


When we talk of research methodology, we not only talk of the research methods but also consider the logic behind the methods we use in the context of our research study. We explain why we are using a particular method or technique and why we are not using others, so that the research results are capable of being evaluated either by the researcher himself or by others.

What's the Difference Between Method and Methodology?


Method: techniques for gathering evidence; the various ways of proceeding in gathering information.
Methodology: the underlying theory and analysis of how research does or should proceed, often influenced by discipline.

Research and Scientific method.


The scientific method:
1. Is based on empirical evidence.
2. Utilizes relevant concepts.
3. Is committed to only objective considerations.
4. Presupposes ethical neutrality.
5. Results in probabilistic predictions.
6. Its methodology is made known to all concerned for critical scrutiny.
7. Aims at formulating the most general axioms, i.e., scientific theories.

Importance of knowing how research is done.


For the researcher, research methodology and techniques constitute the tools of his trade and develop the right bent of mind.
Knowing how research is done inculcates the ability to evaluate and use research results with reasonable confidence.
It gives the satisfaction of acquiring a new intellectual tool.
In one way or another, all of us are consumers of research results.

Scope of Research
Finance, budgeting and investments.
Purchasing, procurement and exploration.
Production management: physical distribution, facility planning, manufacturing planning.
Research and development.

Research in Management & Business


Marketing research: product research, market characteristics research, competitive positions and trends research, advertising and promotion research, new product launch research, product positioning, sales research, distribution research, size of market, customer satisfaction research.

Significance of Marketing research


Sales forecasting helps in the preparation of the MPS and MRP.
Efficient budgetary control.
Bridges the gap between consumers and manufacturers.
Provides market and customer information.
Development of marketing strategy.

Social science research


Helps the nation in exploring policies on social and economic structures.
Social attitudes.
Social values and behavior.
Factors motivating individuals and groups of a society.

Use of computers in research


SPSS, Wincross, GMI Research Analyser, MarketSight, Memphis market intelligence survey explorer.
AMOS, Hierarchical Linear Modeling (HLM), SAS, MATLAB, Maple.
Minitab, SYSTAT, ARENA.

Research Planning
Designing a research plan:
Identifying the need for research.
Selecting the research method.
Collecting data.
Analyzing the collected data.
Documenting the analyzed data.

RESEARCH PROCESS (flow chart)
Define research problem → Review concepts and theories / Review previous research findings → Formulate hypotheses → Design research (including sample design) → Collect data → Analyze data → Interpret and report.

Research process
1. Formulating the research problem.
2. Extensive literature survey.
3. Development of working hypotheses.
4. Preparing the research design.
5. Determining sample design: deliberate sampling (convenience and judgment sampling), simple random sampling, systematic sampling, stratified sampling, quota sampling, cluster sampling and area sampling, multi-stage sampling, sequential sampling (a sampling sketch follows this list).
6. Collecting the data: observation, personal interview, telephone interview, mailing of questionnaires, schedules.
7. Analysis of data: coding, editing and tabulation.
8. Hypothesis testing.
9. Generalizations and interpretation.
10. Preparation of the report or the thesis.
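To make step 5 concrete, here is a minimal sketch of three of the listed techniques (simple random, systematic and stratified sampling) using only Python's standard library. The population, the "department" strata and the sample size are hypothetical values chosen purely for illustration.

```python
import random

random.seed(42)

# Hypothetical population: 100 employees, each tagged with a department (the stratum).
population = [{"id": i, "dept": random.choice(["A", "B", "C"])} for i in range(100)]

def simple_random_sample(items, n):
    # Every item has an equal chance of being selected.
    return random.sample(items, n)

def systematic_sample(items, n):
    # Pick every k-th item after a random start, where k = N / n.
    k = len(items) // n
    start = random.randrange(k)
    return items[start::k][:n]

def stratified_sample(items, n, key):
    # Sample from each stratum roughly in proportion to its size.
    strata = {}
    for item in items:
        strata.setdefault(key(item), []).append(item)
    sample = []
    for members in strata.values():
        share = max(1, round(n * len(members) / len(items)))
        sample.extend(random.sample(members, min(share, len(members))))
    return sample

print(len(simple_random_sample(population, 10)))
print(len(systematic_sample(population, 10)))
print(len(stratified_sample(population, 10, key=lambda x: x["dept"])))
```

Deliberate (convenience or judgment) sampling, by contrast, involves no such chance mechanism: the researcher simply picks the items that are convenient or that appear representative.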

Nature of research.
Criteria for good research:
Systematic.
Logical.
Empirical.
Objectivity.
Control.
Universality.
Free from personal biases.
Reproducibility/replicability.

Problems encountered by researchers in India.


1. The lack of scientific training in the methodology of research.
2. Insufficient interaction between academia and industry.
3. There is a need for generating confidence that the information/data obtained from a business unit will not be misused.
4. Research studies overlapping one another are undertaken quite often for want of adequate information.
5. There does not exist a code of conduct for researchers.
6. Difficulty of adequate and timely secretarial assistance.
7. Library management and functioning is not satisfactory at many places.
8. Many of our libraries are not able to get copies of old and new Acts/rules, reports and other government publications in time.
9. Difficulty of timely availability of published data.
10. The problem of conceptualization.

Defining the Research Problem.


Components of a research problem:
There must be an individual or a group which has some difficulty or problem.
There must be some objectives to be attained. If one wants nothing, one cannot have a problem.
There must be alternative means or courses of action for obtaining the objective one wishes to attain. This means that there must be at least two means available to the researcher, for if he has no choice of means, he cannot have a problem.
There must remain some doubt in the mind of the researcher with regard to the selection of alternatives. This means that research must answer the question concerning the relative efficiency of the possible alternatives.
There must be some environment to which the difficulty pertains.

SELECTING THE PROBLEM

Points to be noted:
1. A subject which is overdone should not normally be chosen, for it will be a difficult task to throw any new light on it in such a case.
2. Controversial subjects should not become the choice of an average researcher.
3. Too narrow or too vague problems should be avoided.
4. The subject selected for research should be familiar and feasible, so that the related research material or sources of research are within one's reach.
5. The importance of the subject, the qualifications and training of the researcher, the costs involved and the time factor should be considered.
6. The selection of the problem must be preceded by a preliminary study.

NECESSITY OF DEFINING THE PROBLEM.


A problem clearly stated is a problem half solved. A proper definition of the problem enables the researcher to stay on the right track. The what, when, where and why questions can be answered well only if the research problem is well defined.

Technique involved in defining the problem

1. Statement of the problem in a general way: preliminary/pilot survey.
2. Understanding the nature of the problem.
3. Surveying the available literature.
4. Developing the ideas through discussions: experience survey.
5. Rephrasing the research problem: working hypotheses.

Research design
A research design is the arrangement of conditions for collection and analysis of data in a manner that aims to combine relevance to the research purpose with economy in procedure.


Research design questions


1. What is the study about?
2. Why is the study being made?
3. Where will the study be carried out?
4. What type of data is required?
5. Where can the required data be found?
6. What periods of time will the study include?
7. What will be the sample design?
8. What techniques of data collection will be used?
9. How will the data be analyzed?
10. In what style will the report be prepared?

Parts of research design


1. The sampling design.
2. The observational design.
3. The statistical design.
4. The operational design.
Features: it is a plan; it is a strategy; it includes time and cost budgets.
Need for research design: it serves as a map or blueprint for the study.

Need for research design


Helps in the smooth conduct of the various research operations.
Requires less effort, time and money.
Helps in deciding the methods and techniques for analyzing the data.
Factors to be considered: sources of information, skills of the researcher and staff, objectives of the problem, nature of the problem, availability of time and money.

Features of good design.


Flexible, appropriate, efficient, economical.
Factors to be considered:
1. The means of obtaining information.
2. Availability and skills of the researcher and his staff.
3. Objective of the problem to be studied.
4. Nature of the problem.
5. Availability of time and money.

Important concepts related to research design.


1. Dependent and independent variables.
2. Extraneous variable.
3. Control.
4. Confounded relationship.
5. Research hypotheses.
6. Experimental and non-experimental hypothesis-testing research.
7. Experimental and control groups.
8. Treatments.
9. Experiment.
10. Experimental units.

DIFFERENT RESEARCH DESIGNS


Research design in case of exploratory research studies:
1. Survey of concerning literature.
2. Experience survey.
3. Analysis of insight-stimulating examples.

Research design in case of descriptive and diagnostic research studies (survey design):
1. Formulating the objectives of the study.
2. Designing the methods of data collection.
3. Selecting the sample.
4. Collecting the data.
5. Processing and analyzing the data.
6. Reporting the findings.

Research design in case of hypothesis-testing research studies: experimental designs (Prof. R. A. Fisher).

Research design by type of study: Exploratory/Formulative vs. Descriptive/Diagnostic
Overall design: flexible design vs. rigid design.
Sampling design: non-probability sampling vs. probability sampling.
Statistical design: no pre-planned design vs. pre-planned design.
Observational design: unstructured instruments for collecting data vs. structured instruments.
Operational design: no fixed decisions vs. advance decisions.

BASIC PRINCIPLES OF EXPERIMENTAL DESIGNS.


1. The principle of replication.
2. The principle of randomization.
3. The principle of local control.

Important Experimental designs.


Informal experimental designs:
Before-and-after without control design.
After-only with control design.
Before-and-after with control design.
Formal experimental designs:
Completely randomized design (CR design).
Randomized block design (RB design).
Latin square design (LS design).
Factorial designs.

Before and after without control design.


Test area: level of phenomenon before treatment (X) → treatment introduced → level of phenomenon after treatment (Y).
Treatment effect = Y - X.

After only with control design.


Test area: treatment introduced → level of phenomenon after treatment (Y).
Control area: level of phenomenon without treatment (Z).
Treatment effect = Y - Z.

Before and after with control design


Test area: level of phenomenon before treatment (X) in time period 1 → treatment introduced → level of phenomenon after treatment (Y) in time period 2.
Control area: level of phenomenon without treatment (A) in time period 1 → level of phenomenon without treatment (Z) in time period 2.
Treatment effect = (Y - X) - (Z - A).
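A short worked example of the three informal-design formulas above, with hypothetical phenomenon levels (the numbers are invented purely to show the arithmetic):

```python
# Hypothetical phenomenon levels (e.g., weekly sales in units); values are illustrative only.
X = 100.0  # test area, before treatment
Y = 130.0  # test area, after treatment
A = 105.0  # control area, without treatment (time period 1)
Z = 115.0  # control area, without treatment (time period 2)

# Before-and-after without control: change in the test area alone.
effect_without_control = Y - X            # 30.0

# After-only with control: test area after treatment vs. control area.
effect_after_only = Y - Z                 # 15.0

# Before-and-after with control: change in test area minus change in control area.
effect_with_control = (Y - X) - (Z - A)   # 30.0 - 10.0 = 20.0

print(effect_without_control, effect_after_only, effect_with_control)
```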


Other Experimental designs.

Completely randomized design: two-group simple randomized design; random replication design.
Latin square design.
Factorial designs.
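To illustrate the layout idea behind a Latin square design, here is a minimal sketch that builds an n x n square by cyclic rotation, so that each treatment appears exactly once in every row and every column. The treatment labels (fertilizers A-E) are hypothetical; in an actual experiment the rows, columns and treatment assignment would also be randomized before use.

```python
def latin_square(treatments):
    # Cyclic construction: row i is the treatment list rotated left by i positions,
    # so every treatment occurs exactly once per row and once per column.
    n = len(treatments)
    return [[treatments[(i + j) % n] for j in range(n)] for i in range(n)]

# Hypothetical treatments: five fertilizer types A-E.
for row in latin_square(["A", "B", "C", "D", "E"]):
    print(" ".join(row))
```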


Research Plan
Research objectives.
Problem statement.
Operational definition of the concepts.
Methods to be adopted.
Details of techniques to be adopted.
Population to be studied.
Methods for processing data.
Results of pilot tests conducted.

Benefits of Research plan


Helps in organizing ideas in a form that makes it possible to look for flaws and inadequacies.
Provides an inventory of what must be done and which materials have to be collected.
Serves as a document that can be given to others for review and comments.

CH4: Sampling design


Census survey: all the items in any field of inquiry constitute a universe or population; a complete enumeration of all items in the population is known as a census inquiry.
Sample survey: selection of only a few items, a miniature cross-section of the population, called a sample; the procedure is the sampling technique.

Sample design: Implications


A sample design is a definite plan for obtaining a sample from a given population. It refers to the technique or the procedure the researcher would adopt in selecting items for the sample. It also lays down the number of items to be selected for the sample. Sample design is determined before data are collected.

Steps in Sample design


1. Type of universe.
2. Sampling unit.
3. Source list.
4. Size of sample.
5. Parameters of interest.
6. Budgetary constraint.
7. Sampling procedure.

Criteria of selecting a sampling procedure.


Systematic bias results from errors in the sampling procedures; it cannot be reduced or eliminated by increasing the sample size.
Factors causing systematic bias:
Inappropriate sampling frame.
Defective measuring device.
Non-respondents.
Indeterminacy principle.
Natural bias in the reporting of data.

Sampling errors
Sampling errors are the random variations in the sample estimates around the true population parameters. Sampling error can be measured for a given sample design and size; this measurement is usually called the precision of the sampling plan. While selecting a sampling procedure, the researcher must ensure that the procedure causes a relatively small sampling error and helps to control systematic bias in a better way.
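The following sketch illustrates how sampling error (precision) can be quantified for the mean of a simple random sample, both by the usual standard-error formula and by drawing many repeated samples. The synthetic population is hypothetical and for illustration only.

```python
import random
import statistics

random.seed(7)

# Synthetic population of 10,000 values (e.g., monthly household expenses); illustrative only.
population = [random.gauss(500, 80) for _ in range(10_000)]

n = 100
sample = random.sample(population, n)

# Analytical estimate of the standard error of the mean: s / sqrt(n).
se_formula = statistics.stdev(sample) / n ** 0.5

# Empirical check: spread of the sample mean over many independent samples.
means = [statistics.mean(random.sample(population, n)) for _ in range(500)]
se_empirical = statistics.stdev(means)

print(round(se_formula, 2), round(se_empirical, 2))  # both should be close to 80 / sqrt(100) = 8
```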

Characteristics of a good sample design


A good sample design should:
1. Result in a truly representative sample.
2. Result in a small sampling error.
3. Be viable in the context of the funds available for the research study.
4. Allow systematic bias to be controlled in a better way.
5. Yield results that can be applied, in general, to the universe with a reasonable level of confidence.

Sampling designs
Sample designs are classified by representation basis (probability vs. non-probability sampling) and by element selection technique (unrestricted vs. restricted sampling):
Unrestricted sampling: simple random sampling (probability); haphazard or convenience sampling (non-probability).
Restricted sampling: complex random sampling, e.g., cluster, systematic and stratified sampling (probability); purposive sampling, e.g., quota and judgment sampling (non-probability).

Measurement and scaling


Measurement: The assignment of numbers or other symbols to characteristics of objects according to certain pre-specified rules. Scaling: The generation of a continuum upon which measured objects are located.


Primary scales of measurement


1. Nominal scale: a scale whose numbers serve only as labels or tags for identifying and classifying objects, with a strict one-to-one correspondence between the numbers and the objects (e.g., numbers assigned to runners).
2. Ordinal scale: a ranking scale in which numbers are assigned to objects to indicate the relative extent to which some characteristic is possessed; thus it is possible to determine whether an object has more or less of a characteristic than some other object (e.g., rank order of winners).
3. Interval scale: a scale in which numbers are used to rate objects such that numerically equal distances on the scale represent equal distances in the characteristic being measured (e.g., performance rating on a 0-10 scale).
4. Ratio scale: the highest scale; it allows the researcher to identify or classify objects, rank-order the objects, and compare intervals or differences, and it is also meaningful to compute ratios of scale values (e.g., time to finish in seconds).
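As a quick illustration of what each primary scale permits, the sketch below applies the summary statistic conventionally regarded as meaningful at each level (mode for nominal, median for ordinal, mean for interval, ratios for ratio). The data values are hypothetical.

```python
import statistics
from collections import Counter

# Nominal: numbers are only labels (e.g., runners' bib numbers) -> mode/counts are meaningful.
bib_numbers = [7, 12, 7, 3, 12, 7]
print("nominal mode:", Counter(bib_numbers).most_common(1)[0][0])

# Ordinal: ranks convey order but not distance (e.g., finishing positions) -> median is meaningful.
finish_ranks = [1, 2, 3, 4, 5]
print("ordinal median:", statistics.median(finish_ranks))

# Interval: equal distances are meaningful but there is no true zero (e.g., 0-10 ratings) -> mean is meaningful.
ratings = [6, 7, 9, 5, 8]
print("interval mean:", statistics.mean(ratings))

# Ratio: a true zero exists (e.g., finishing time in seconds) -> ratios are meaningful.
times = [12.1, 13.4, 15.0]
print("ratio of fastest to slowest:", times[0] / times[-1])
```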

Scaling techniques
1. Comparative scale: There is a direct comparison of stimulus objects with one another. 2. Non comparative scales: Each stimulus object is scaled independently of the other objects in the stimulus set.


Comparative scaling techniques


Paired comparison scaling: a respondent is presented with two objects at a time and asked to select one object in the pair according to some criterion. The data obtained are ordinal in nature.
Transitivity of preference: an assumption made in order to convert paired comparison data into rank order data (a sketch of this conversion follows).
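Here is a minimal sketch of that conversion: under the transitivity assumption, each object can be ranked by the number of paired comparisons it wins. The brands and the stated preferences are hypothetical.

```python
from collections import Counter

# Hypothetical paired-comparison results: each tuple is (winner, loser) for one pair shown to a respondent.
preferences = [
    ("BrandA", "BrandB"),
    ("BrandA", "BrandC"),
    ("BrandB", "BrandC"),
]

# Count how often each object was preferred; under transitivity this yields a rank order.
wins = Counter(winner for winner, _ in preferences)
objects = {obj for pair in preferences for obj in pair}
rank_order = sorted(objects, key=lambda obj: wins[obj], reverse=True)

print(rank_order)  # ['BrandA', 'BrandB', 'BrandC']
```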

Rank order scaling: respondents are presented with several objects simultaneously and asked to order or rank them according to some criterion.
Constant sum scaling: respondents are required to allocate a constant sum of units among a set of stimulus objects with respect to some criterion.
Q-sort scaling: uses a rank order procedure to sort objects based on similarity with respect to some criterion.

Non-comparative scaling techniques.


Continuous rating scale (graphic rating scale): respondents rate the objects by placing a mark at the appropriate position on a line that runs from one extreme of the criterion variable to the other.
Itemized rating scale: a scale having numbers and/or brief descriptions associated with each category; the categories are ordered in terms of scale position.

Itemized rating scale


Likert scale: a scale with five response categories ranging from "strongly disagree" to "strongly agree", which requires the respondents to indicate a degree of agreement or disagreement with each of a series of statements related to the stimulus objects (a scoring sketch follows below).
Semantic differential: a 7-point rating scale with end points associated with bipolar labels that have semantic meaning.
Stapel scale: a scale for measuring attitudes that consists of a single adjective in the middle of an even-numbered range of values, from -5 to +5, without a neutral (zero) point.
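A small sketch of the conventional way Likert responses are scored: each category is mapped to a number from 1 (strongly disagree) to 5 (strongly agree), negatively worded statements are reverse-scored, and the item scores are summed per respondent. The statements, responses and the particular 1-5 coding are illustrative assumptions, not taken from these slides.

```python
SCORES = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

def score(response, reverse=False):
    # Reverse-score negatively worded statements so a high score always means a favorable attitude.
    value = SCORES[response]
    return 6 - value if reverse else value

# Hypothetical respondent answering three statements; the second statement is negatively worded.
responses = [("A", False), ("D", True), ("SA", False)]
total = sum(score(r, reverse) for r, reverse in responses)
print(total)  # 4 + 4 + 5 = 13
```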

Non-comparative itemized rating scale decisions:
1. The number of scale categories to use.
2. Balanced versus unbalanced scale.
3. Odd or even number of categories.
4. Forced versus unforced choices.
5. The nature and degree of verbal descriptions.
6. The physical form or configuration of the scale.

Multi-item scale development
1. Develop a theory.
2. Generate an initial pool of items: theory, secondary data and qualitative research.
3. Select a reduced set of items based on qualitative judgment.
4. Collect data from a large pretest sample.
5. Perform statistical analysis.
6. Develop a purified scale.
7. Collect more data from a different sample.
8. Evaluate scale reliability, validity and generalizability.
9. Prepare the final scale.

Scale evaluation
1. Reliability: test/retest, alternative forms, internal consistency.
2. Validity: content, criterion, construct (convergent, discriminant and nomological).
3. Generalizability.

Reliability
Reliability: the extent to which a scale produces consistent results if repeated measurements are made on the characteristic.
Test-retest reliability: an approach for assessing reliability in which respondents are administered identical sets of scale items at two different times, under conditions as nearly equivalent as possible.
Alternative-forms reliability: an approach for assessing reliability that requires two equivalent forms of the scale to be constructed; the same respondents are then measured at two different times.

Reliability
Internal consistency reliability: an approach for assessing the consistency of a set of items when several items are summated in order to form a total score for the scale.
Split-half reliability: a form of internal consistency reliability in which the items constituting the scale are divided into two halves and the resulting half scores are correlated.
Coefficient alpha: a measure of internal consistency reliability that is the average of all possible split-half coefficients resulting from different splittings of the scale items (see the sketch below).
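The sketch below computes a split-half coefficient (with the Spearman-Brown correction) and coefficient (Cronbach's) alpha for a small, made-up item-score matrix, using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The scores are hypothetical, and statistics.correlation requires Python 3.10 or later.

```python
import statistics

# Hypothetical scores of 6 respondents on 4 Likert-type items (rows = respondents).
scores = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
]
k = len(scores[0])

# Coefficient alpha: k/(k-1) * (1 - sum of item variances / variance of total scores).
item_vars = [statistics.pvariance([row[i] for row in scores]) for i in range(k)]
total_var = statistics.pvariance([sum(row) for row in scores])
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)

# Split-half reliability: correlate odd-item and even-item half scores,
# then apply the Spearman-Brown correction for full test length.
half1 = [row[0] + row[2] for row in scores]
half2 = [row[1] + row[3] for row in scores]
r = statistics.correlation(half1, half2)
split_half = 2 * r / (1 + r)

print(round(alpha, 3), round(split_half, 3))
```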

Validity
Validity: the extent to which differences in observed scale scores reflect true differences among objects on the characteristic being measured, rather than systematic or random errors.
Content validity: a type of validity, sometimes called face validity, that consists of a subjective but systematic evaluation of the representativeness of the content of a scale for the measuring task at hand.
Criterion validity: a type of validity that examines whether the measurement scale performs as expected in relation to other variables selected as meaningful criteria.

Validity
Construct validity: a type of validity that addresses the question of what construct or characteristic the scale is measuring; an attempt is made to answer theoretical questions of why a scale works and what deductions can be made concerning the theory underlying the scale.
Convergent validity: a measure of construct validity that measures the extent to which the scale correlates positively with other measures of the same construct.

Validity
Discriminant validity: a type of construct validity that assesses the extent to which a measure does not correlate with other constructs from which it is supposed to differ.
Nomological validity: a type of validity that assesses the relationship between theoretical constructs; it seeks to confirm significant correlations between the constructs as predicted by theory.

Generalizability/Practicality
The degree to which a study based on a sample applies to the universe of generalization.

DATA COLLECTION
Primary data and secondary data:
Secondary data are ready-made data that already exist and are useful for the research topic of interest.
Primary data are generated/collected afresh by the researcher for the purpose of the investigation.
Collection of primary data: observation method, interview method, questionnaires, schedules, survey methods, etc.

OBSERVATION METHOD
What should be observed? How should the observations be recorded? How can the accuracy of observation be ensured?
Types: structured, unstructured, participant, non-participant, disguised, controlled and uncontrolled observation.
Limitations: expensive, provides limited information, affected by unwanted factors.

INTERVIEW METHOD
Involves the presentation of oral-verbal stimuli and replies in terms of oral-verbal responses.
Personal interview: involves two people.
Structured, e.g., "What is the main function of your production department?"
Unstructured, e.g., "How would you evaluate the benefits of the new machinery installed in your production department?"

Questionnaire and form design


Questionnaire: A structured technique for data collection that consists of a series of questions, written or verbal, that a respondent answers. Mail, Telephone, Personal, Electronic questionnaires.


Questionnaire design process


1. Specify the information needed.
2. Specify the type of interviewing method.
3. Determine the content of individual questions.
4. Design the questions to overcome the respondent's inability and unwillingness to answer.
5. Decide on question structure.
6. Determine the question wording.
7. Arrange the questions in proper order.
8. Identify the form and layout.
9. Reproduce the questionnaire.
10. Eliminate bugs by pre-testing.

Individual question content: Is the question necessary? Are several questions needed instead of one?
Double-barreled question: a single question that attempts to cover two issues; such questions can be confusing to respondents and result in ambiguous responses.
Overcoming inability to answer: Is the respondent informed? Can the respondent remember? Can the respondent articulate?
Filter question: an initial question in a questionnaire that screens potential respondents to ensure they meet the requirements of the sample.
Telescoping: a psychological phenomenon that takes place when an individual telescopes or compresses time by remembering an event as occurring more recently than it actually occurred.

Questionnaires
Advantages: cost effective, impartial, respondents are offered enough time, a large sample of questions can be used to make results reliable.
Disadvantages: possibility of non-response, respondents have to be skilled and supportive, vague replies, not possible to identify the right respondent, time consuming.

Form of a Questionnaire
Example: statements about software usability, each rated on a five-point scale from SA (strongly agree) through A (agree), N (neutral) and D (disagree) to SD (strongly disagree):
Microsoft Word software is not very difficult to use.
I am able to understand the contents of menus and toolbars.
It is easy to understand and operate.
The software is very flexible.
It is very easy to discover the new features.
It is very pleasing to use the software.

Contents
Open-ended questions do not require a specific response, e.g., "What is your opinion on the current income tax policy?"
Closed-ended questions: fill in the blanks, dichotomous questions, ranking scale questions, multiple choice questions, rating scale questions.

Schedules
A schedule is a questionnaire administered through face-to-face interaction with the respondent.
Objectives: created for a definite item of inquiry; acts as an aid to memory while information is being collected; helps in tabulating and analyzing data.
Types: observation, rating, document, institution survey and interview schedules.

Merits & demerits of Schedules


Merits: the researcher is always there to help the respondent; avoids fake replies; personal contact with the researcher; better understanding of personality, living conditions and values; easy to detect and rectify defects.
Demerits: costly and time consuming; requires well-trained and experienced field workers; the respondent may not be able to tell facts; difficult to organize the various activities.

Questionnaires & Schedules compared


A questionnaire is sent through the mail and filled in by the respondent; a schedule is filled in by the researcher himself.
Data collection through questionnaires is cheaper than through schedules, since the latter involves training interviewers.
Response to questionnaires is generally low but is high in the case of schedules.
The identity of respondents is not clear in questionnaires.
The questionnaire method is time consuming.
The questionnaire does not allow personal contact with the respondent.
The questionnaire method is useful only if the respondent is literate.
The risk of incomplete and incorrect information is higher with questionnaires.

Other methods of data collection


Warranty cards/business reply cards.
Distributor or store audits.
Pantry audits.
Consumer panels: transitory and continuing.
Use of mechanical devices: eye cameras, pupilometric cameras, psychogalvanometers, motion picture cameras.
Projective techniques: word association test, sentence completion test, story completion test, verbal projection test, play technique, quizzes, tests and examinations, sociometry.
Depth interviews.
Content analysis.

Collection of Secondary data


Sources: publications of central, state, local and foreign governments; technical and trade journals; books, magazines and newspapers; research reports; public records; statistics and historical documents.
Characteristics to be checked: reliability, suitability and adequacy of the data.
Selection of an appropriate method of data collection depends on: the nature, scope and object of the inquiry; availability of funds; the time factor; the precision required.

Case study method


A familiar method of qualitative analysis: a thorough and complete examination of a social unit; an in-depth study of a particular unit. The objective is to determine the factors that are responsible for the behavior patterns of the given unit in its totality. Often described as a social microscope.
Major phases: identification of the status of the phenomenon; accumulation of data; investigating the history; analysis and recognition of causal factors; application of corrective measures; review of the programme.

Merits and demerits of case study


Merits: the complete behavior pattern of the concerned unit is understood; the case study deepens our perception and gives a clear insight into life; the researcher can obtain a genuine and progressive record of personal experiences; natural history is determined; helps in framing hypotheses; helps in gaining in-depth knowledge of a particular subject; increases the experience of the researcher; enables the researcher to observe social changes; maintains continuity of the research process; helps in decision making.
Demerits: case situations are rarely comparable; chances of false generalizations; takes a lot of time and money; based on certain assumptions; applicable only to a limited geographical area.

Report writing & presentation


Informational report: inspection reports, inventory reports, assessment reports and performance reports.
Interpretive report: inferences, recommendations and suggestions.
Characteristics of a good report:
Language and style.
Structure.
Presentation.
References.

Report Writing & Presentation


Language and style.
Structure: sequence and order (context and research procedures, questions/hypotheses, data analysis, summary, appendix).
Presentation: capitals, headings, tables, figures and equations.
References: citations and quotes.
Mechanics of writing a report: size and physical design, layout, quotations, footnotes, documentation style, abbreviations, use of statistics, charts and graphs, final draft and index.

Format of a Research report


Preliminary pages.
Main text: introduction, methodology/design, analysis, statement of findings, results, implications, summary, suggestions and recommendations.
Written vs. oral report.
Principles of writing a report: purpose, organization, brevity, clarity, scheduling, cost.
