"The scientist has no other method than doing his damnedest." - P. W. Bridgman, Reflections of a Physicist.
"All progress is born of inquiry. Doubt is often better than overconfidence, for it leads to inquiry, and inquiry leads to invention."
DEPT. OF IEM, DSCE, B'LORE-78 1
Plan of presentation
1. Meaning of research.
2. Objectives of research.
3. Motivation in research.
4. Types of research.
5. Research approaches.
6. Significance of research.
7. Research methods vs. research methodology.
8. Research and scientific method.
9. Importance of knowing how research is done.
10. Research process.
11. Criteria for good research.
12. Problems encountered by researchers in India.
Meaning of research.
1. A careful investigation or inquiry, especially through search for new facts in any branch of knowledge.
2. Redman and Mory define research as a systematized effort to gain new knowledge.
3. Research is an original contribution to the existing stock of knowledge.
4. Inquisitiveness is the mother of knowledge.
5. Pursuit of truth with the help of study, observation, comparison and experiment.
6. A systematic approach concerning generalization and the formulation of theory is also research.
Objectives of research.
1. To gain familiarity with a phenomenon or to achieve new insights into it (exploratory or formulative research).
2. To portray accurately the characteristics of a particular individual, situation or group (descriptive research).
3. To determine the frequency with which something occurs or with which it is associated with something else (diagnostic research).
4. To test a hypothesis of a causal relationship between variables (hypothesis-testing research).
Motivation in research
1. Desire to get a research degree along with its consequential benefits.
2. Desire to face the challenge of solving unsolved problems; concern over practical problems initiates research.
3. Desire to experience the intellectual joy of doing creative work.
4. Desire to be of service to society.
5. Desire to get respectability.
Types of research.
1. Descriptive (ex post facto research) vs. Analytical.
2. Applied vs. Fundamental (Basic).
3. Quantitative vs. Qualitative.
4. Conceptual vs. Empirical.
5. Other types: field/laboratory research, clinical/diagnostic research, exploratory, conclusive, historical research.
Research approaches.
Quantitative: experimental, simulation, inferential.
Qualitative: focus group interviews, projective techniques, depth interviews, ethnographic, phenomenological, field research.
Significance of research.
Research inculcates scientific and inductive thinking, and it promotes the development of logical habits of thinking and organization. The role of research in several fields of applied economics, whether related to business or to the economy as a whole, has greatly increased in modern times. Research provides the basis for nearly all government policies in our economic system.
Scope of Research
Finance, budgeting & investments.
Purchasing, procurement and exploration.
Production management: physical distribution, facility planning, manufacturing planning.
Research & development.
Research Planning
Designing a research plan:
1. Identifying the need for research.
2. Selecting the research method.
3. Collecting data.
4. Analyzing the collected data.
5. Documenting the analyzed data.
[Figure: the research process: formulate hypotheses → collect data → analyze data.]
Research process
1. Formulating the research problem.
2. Extensive literature survey.
3. Development of working hypotheses.
4. Preparing the research design.
5. Determining sample design: deliberate sampling (convenience & judgment sampling), simple random sampling, systematic sampling, stratified sampling, quota sampling, cluster sampling and area sampling, multi-stage sampling, sequential sampling.
6. Collecting the data: observation, personal interview, telephone interview, mailing of questionnaires, schedules.
7. Analysis of data: coding, editing & tabulation.
8. Hypothesis testing.
9. Generalizations and interpretation.
10. Preparation of the report or the thesis.
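The sample designs named in step 5 can be illustrated with a short sketch. This is a minimal illustration only, assuming a hypothetical sampling frame of 100 numbered units; it uses nothing beyond Python's standard library:

```python
import random

random.seed(42)
population = list(range(100))  # hypothetical sampling frame of 100 units

# Simple random sampling: every unit has an equal chance of selection.
simple = random.sample(population, 10)

# Systematic sampling: pick a random start, then take every k-th unit.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: divide the frame into strata, sample from each stratum.
strata = {"low": population[:50], "high": population[50:]}
stratified = [u for group in strata.values() for u in random.sample(group, 5)]

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```

Each design yields a sample of the same size here, but their sampling errors differ: stratification, for instance, tends to reduce error when the strata are internally homogeneous.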
Nature of research.
Criteria for good research:
Systematic.
Logical.
Empirical.
Objectivity.
Control.
Universality.
Free from personal biases.
Reproducibility/replicability.
Points to be noted:
1. A subject which is overdone should not normally be chosen, for it will be a difficult task to throw any new light on such a case.
2. A controversial subject should not become the choice of an average researcher.
3. Too narrow or too vague problems are to be avoided.
4. The subject selected for research should be familiar and feasible, so that the related research material or sources of research are within one's reach.
5. The importance of the subject, the qualifications and training of the researcher, the costs involved and the time available should be considered.
6. Selection of the problem must be preceded by a preliminary study.
1. Statement of the problem in a general way: preliminary/pilot survey.
2. Understanding the nature of the problem.
3. Surveying the available literature.
4. Developing the ideas through discussions: experience survey.
5. Rephrasing the research problem: working hypotheses.
Research design
A research design is the arrangement of conditions for collection and analysis of data in a manner that aims to combine relevance to the research purpose with economy in procedure.
Research design in case of descriptive and diagnostic research studies. (Survey design)
1. Formulating the objectives of the study.
2. Designing the methods of data collection.
3. Selecting the sample.
4. Collecting the data.
5. Processing and analyzing the data.
6. Reporting the findings.
Research design by type of study:

                      Exploratory/Formulative                     Descriptive/Diagnostic
Overall design        Flexible design                             Rigid design
Sampling design       Non-probability sampling                    Probability sampling
Statistical design    No pre-planned design for analysis          Pre-planned design for analysis
Observational design  Unstructured instruments for collecting data  Structured instruments for collecting data
Operational design    No fixed decisions                          Advance decisions
[Figures: informal experimental designs. Before-and-after design: treatment introduced; level of phenomenon after treatment (Y). After-only with control design: treatment introduced in the test area, with a control area for comparison.]
Completely randomized design.
Two-group simple randomized design.
Random replication design.
Latin square design.
Factorial designs.
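The defining property of a Latin square design is that each treatment appears exactly once in every row and every column. As a small illustration, the sketch below builds such a square by cyclic rotation (one standard construction, not the only one); the five treatment labels are hypothetical:

```python
def latin_square(treatments):
    """Build an n x n Latin square by cyclic rotation: row i is the
    treatment list rotated left by i positions, so each treatment occurs
    exactly once in every row and exactly once in every column."""
    n = len(treatments)
    return [[treatments[(i + j) % n] for j in range(n)] for i in range(n)]

square = latin_square(["A", "B", "C", "D", "E"])  # e.g. five fertilizer treatments
for row in square:
    print(" ".join(row))

# Verify the Latin square property.
n = len(square)
assert all(len(set(row)) == n for row in square)                      # rows
assert all(len({square[i][j] for i in range(n)}) == n for j in range(n))  # columns
```

In an agricultural setting, rows and columns would typically correspond to two blocking factors (say, soil fertility gradient and seed quality), so that treatment comparisons are balanced across both.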
Research Plan
Research objectives.
Problem statement.
Operational definition of the concepts.
Methods to be adopted.
Details of techniques to be adopted.
Population to be studied.
Methods for processing data.
Results of pilot tests conducted.
Sampling errors
Sampling errors are random variations in sample estimates around the true population parameters. Sampling errors can be measured for a given sample design and size; this measurement is usually called the precision of the sampling plan. While selecting a sampling procedure, the researcher must ensure that the procedure causes a relatively small sampling error and helps to control systematic bias.
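The notion of precision can be made concrete: a common measure of sampling error is the standard error of the sample mean, which shrinks as the sample size grows. A minimal sketch with hypothetical measurements, using only Python's standard library:

```python
import statistics

# Hypothetical sample of 25 measurements drawn from a larger population.
sample = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7, 12.5, 12.0,
          11.9, 12.1, 12.2, 12.0, 11.8, 12.3, 12.1, 11.9, 12.4, 12.0,
          12.2, 11.8, 12.1, 12.0, 11.9]

mean = statistics.mean(sample)
sd = statistics.stdev(sample)        # sample standard deviation
se = sd / len(sample) ** 0.5         # standard error of the mean = sd / sqrt(n)
print(f"mean = {mean:.3f}, standard error = {se:.4f}")

# A rough 95% confidence interval for the population mean:
low, high = mean - 1.96 * se, mean + 1.96 * se
```

A narrower interval (smaller standard error) means a more precise sampling plan; quadrupling the sample size halves the standard error.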
Sampling designs
Sample designs are classified on two bases:
Representation basis: probability sampling vs. non-probability sampling.
Element selection technique: unrestricted sampling vs. restricted sampling.
3. Interval scale: a scale in which numbers are used to rate objects such that numerically equal distances on the scale represent equal distances in the characteristic being measured (e.g. performance rating on a 0-10 scale).
4. Ratio scale: the highest scale. It allows the researcher to identify or classify objects, rank-order the objects, and compare intervals or differences. It is also meaningful to compute ratios of scale values (e.g. time to finish in seconds).
DEPT. OF IEM, DSCE, B'LORE-78 52
Scaling techniques
1. Comparative scales: there is a direct comparison of stimulus objects with one another.
2. Non-comparative scales: each stimulus object is scaled independently of the other objects in the stimulus set.
Rank-order scaling: respondents are presented with several objects simultaneously and asked to order or rank them according to some criterion.
Constant-sum scaling: respondents are required to allocate a constant sum of units among a set of stimulus objects with respect to some criterion.
Q-sort scaling: uses a rank-order procedure to sort objects based on similarity with respect to some criterion.
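Rank-order data of this kind are often summarized by mean ranks across respondents. A minimal sketch, assuming hypothetical rankings of three brands by four respondents (rank 1 = most preferred):

```python
# Hypothetical rank-order data: each respondent ranks three brands,
# with 1 = most preferred and 3 = least preferred.
rankings = [
    {"Brand X": 1, "Brand Y": 2, "Brand Z": 3},
    {"Brand X": 2, "Brand Y": 1, "Brand Z": 3},
    {"Brand X": 1, "Brand Y": 3, "Brand Z": 2},
    {"Brand X": 1, "Brand Y": 2, "Brand Z": 3},
]

# Mean rank per brand: a lower mean rank indicates a stronger preference.
brands = rankings[0].keys()
mean_rank = {b: sum(r[b] for r in rankings) / len(rankings) for b in brands}
for brand, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{brand}: mean rank {rank:.2f}")
```

Because ranks are ordinal data, mean ranks should be interpreted as a summary of preference order, not as interval-scaled magnitudes.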
Non-comparative itemized rating scale decisions:
1. The number of scale categories to use.
2. Balanced versus unbalanced scale.
3. Odd or even number of categories.
4. Forced versus unforced choices.
5. The nature and degree of verbal descriptions.
6. The physical form or configuration of the scale.
Scale evaluation
1. Reliability: test/retest, alternative forms, internal consistency.
2. Validity: content, criterion, construct (convergent, discriminant and nomological).
3. Generalizability.
Reliability
Reliability: the extent to which a scale produces consistent results if repeated measurements are made on the characteristic.
Test-retest reliability: an approach for assessing reliability in which respondents are administered identical sets of scale items at two different times, under conditions as nearly equivalent as possible.
Alternative-forms reliability: an approach for assessing reliability that requires two equivalent forms of the scale to be constructed; the same respondents are then measured at two different times.
Reliability
Internal consistency reliability: an approach for assessing the consistency of a set of items when several items are summated in order to form a total score for the scale.
Split-half reliability: a form of internal consistency reliability in which the items constituting the scale are divided into two halves and the resulting half scores are correlated.
Coefficient alpha: a measure of internal consistency reliability that is the average of all possible split-half coefficients resulting from different splittings of the scale items.
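Both ideas can be sketched numerically. The example below uses hypothetical item scores for five respondents on a four-item scale and computes a split-half correlation and coefficient alpha from first principles; a real analysis would normally use a statistics package:

```python
import statistics

def pearson(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical data: rows are respondents, columns are the 4 scale items.
scores = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
]

# Split-half reliability: correlate totals of the two halves (items 1-2 vs. 3-4).
half1 = [r[0] + r[1] for r in scores]
half2 = [r[2] + r[3] for r in scores]
split_half = pearson(half1, half2)

# Coefficient alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
k = len(scores[0])
item_vars = [statistics.pvariance([r[i] for r in scores]) for i in range(k)]
total_var = statistics.pvariance([sum(r) for r in scores])
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)

print(f"split-half r = {split_half:.3f}, coefficient alpha = {alpha:.3f}")
```

Alpha near 1 indicates high internal consistency; by a common convention, values above roughly 0.7 are considered acceptable.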
Validity
Validity: the extent to which differences in observed scale scores reflect true differences among objects on the characteristic being measured, rather than systematic or random errors.
Content validity: a type of validity, sometimes called face validity, that consists of a subjective but systematic evaluation of how well the content of a scale represents the measuring task at hand.
Criterion validity: a type of validity that examines whether the measurement scale performs as expected in relation to other variables selected as meaningful criteria.
Validity
Construct validity: A type of validity that addresses the question of what construct or characteristic the scale is measuring. An attempt is made to answer theoretical questions of why a scale works and what deductions can be made concerning the theory underlying the scale. Convergent validity: A measure of construct validity that measures the extent to which the scale correlates positively with other measures of the same construct.
Validity
Discriminant validity: a type of construct validity that assesses the extent to which a measure does not correlate with other constructs from which it is supposed to differ.
Nomological validity: a type of validity that assesses the relationship between theoretical constructs. It seeks to confirm significant correlations between the constructs as prescribed by theory.
Generalizability/Practicality
The degree to which a study based on a sample applies to a universe of generalization.
DATA COLLECTION
Primary data & secondary data:
Secondary data are ready-made data that happen to be useful for the research topic of interest. Primary data are generated/collected by the researcher for the purpose of the investigation.
Collection of primary data: observation method, interview method, questionnaires, schedules, survey methods, etc.
OBSERVATION METHOD
Key questions: What should be observed? How should the observation be recorded? How can the accuracy of observation be ensured?
Types: structured, unstructured, participant, non-participant, disguised, controlled and uncontrolled observation.
Limitations: expensive; provides limited information; affected by unwanted factors.
INTERVIEW METHOD
Presentation of oral/verbal stimuli and reply in terms of oral/verbal responses. A personal interview involves two people.
Structured, e.g.: "What is the main function of your production department?"
Unstructured, e.g.: "How would you evaluate the benefits of the new machinery installed in your production department?"
Individual question content: Is the question necessary? Are several questions needed instead of one?
Double-barrelled question: a single question that attempts to cover two issues. Such questions can be confusing to respondents and result in ambiguous responses.
Overcoming inability to answer:
Is the respondent informed? Filter question: an initial question in a questionnaire that screens potential respondents to ensure they meet the requirements of the sample.
Can the respondent remember? Telescoping: a psychological phenomenon in which an individual telescopes, or compresses, time by remembering an event as occurring more recently than it actually did.
Can the respondent articulate?
Questionnaires
Advantages: cost effective; impartial; respondents are given enough time; a large sample of questions can be used, making the results more reliable.
Disadvantages: possibility of non-response; respondents have to be skilled and supportive; vague replies; not possible to verify the identity of the respondent; time consuming.
Form of a Questionnaire
Each statement is rated on a five-point agreement scale (SA = strongly agree, A = agree, N = neutral, D = disagree, SD = strongly disagree):

1. Microsoft Word software is not very difficult to use.
2. I am able to understand the contents of menus and toolbars.
3. It is easy to understand and operate.
4. The software is very flexible.
5. It is very easy to discover the new features.
6. It is very pleasing to use the software.
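Responses on such an SA-SD scale are usually coded numerically before analysis. A minimal sketch; the 5-to-1 coding below is a common convention rather than something fixed by the slide, and the responses are hypothetical:

```python
# Conventional coding for a 5-point agreement scale
# (SA = strongly agree ... SD = strongly disagree).
CODES = {"SA": 5, "A": 4, "N": 3, "D": 2, "SD": 1}

# Hypothetical responses of four respondents to one statement,
# e.g. "It is easy to understand and operate".
responses = ["SA", "A", "A", "N"]

scores = [CODES[r] for r in responses]
mean_score = sum(scores) / len(scores)
print(f"item mean = {mean_score:.2f}")  # 4.00 here
```

Whether such item means are meaningful depends on treating the scale as (approximately) interval; strictly, Likert-type responses are ordinal.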
Contents
Open-ended questions do not require a specific response, e.g. "What is your opinion on the current income tax policy?"
Closed-ended questions: fill-in-the-blank, dichotomous questions, ranking-scale questions, multiple-choice questions, rating-scale questions.
Schedules
A schedule is a questionnaire administered in face-to-face interaction with the respondent.
Objectives: created for a definite item of inquiry; acts as an aid to memory while information is being collected; helps in tabulating and analyzing data.
Types: observation schedule, rating schedule, document schedule, institution survey schedule, interview schedule.
Characteristics:
Reliability, Suitability and adequacy of data.
Major Phases:
Identification of status of the phenomenon, accumulation of data, investigating the history, analysis and recognition of informal factors, application of corrective measures, review of programs.
Demerits:
Structure: sequence and order; appendix.
Presentation: style; capitals; headings; tables, figures and equations; data.
References: citations and quotes.