RESEARCH INSTRUMENT
- Mechanism for measuring phenomena
- Used to gather and record information for assessment or decision making and, ultimately, understanding
- Anything that becomes a means of collecting data

SUBJECTIVE
- Information that originates within an individual
- Reflected by items that measure attitudes, feelings, opinions, values and beliefs

OBJECTIVE
- Attempts to be free of personal interpretation
- Typified by data that are observable
- Demographic information

The choice and development of the research instrument to achieve the research objectives must be done with:
1. Operationalization of the concepts used in the study
2. Development of the strategy for data collection

Note:
- Several different research instruments can be used to achieve the same research objectives

When to choose the research instrument:
- Choosing the research instrument follows conceptualization and the choice of units of analysis
- The relationship between the type of analysis and the research instrument is determined by the questions and by how the researcher will determine the answer

CATEGORIZATION OF INSTRUMENTS
- Mode of administration
  o Who is responsible for completing the instrument?
  o Self-report
     Respondent supplies the information directly
  o Observation
     Instrument is completed by a third person
     Collects data about characteristics intrinsic to an individual (ex: checklist)
     An external rater is also used when information is needed about things rather than people
- Use or purpose
  o Provides a nomenclature for describing instruments; however, it is not easy to make distinctions based solely on intended use
  o Test
     Collection of items developed to measure some human educational or psychological attribute
  o Rating scale
     Used to evaluate
     Makes use of an item format where response choices are ordered on a continuum
     Used to measure attitudes and opinions and to record direct observation and assessment
     Assesses the performance of individuals, organizations, programs and services
- Performance and behavior rating
  o Uses rating scales specifically designed to measure an individual's ability to complete a task or perform an act
  o Typically completed by an external observer or rater
  o May be designed to be descriptive (ex: present/not present)
  o May be designed to evaluate (ex: satisfactory/not satisfactory)
- Checklists
  o Determine the presence or absence of an attribute and count the prevalence of an item or event
  o May use a variety of formats: scales, rank order, dichotomous choices and open-ended questions
- Inventory
  o A list of objects, goods or attributes; the word is also used to describe the process of compiling the list
  o Used to assess a person's interests, characteristics, or skills
  o Developmental inventories – items are usually listed sequentially, where acquisition of basic skills is a precursor to demonstrating more complex skills
  o Information obtained can be weighed against normative data to provide a comparative level of functioning
- Survey, poll, attitude scale and questionnaire
  o Terms used interchangeably to describe instruments designed to obtain factual information and to assess beliefs, opinions and attitudes
  o Can be used to:
     Explore relationships
     Examine attitudes and beliefs
- Psychometric instrument
  o Describes an array of instruments designed to assess cognitive, affective and physical functioning and personality traits
  o Assesses vocational abilities and aptitude to predict an individual's suitability for an occupation
  o Ex: standardized tests (IQ tests)
  o A subcategory of these is designed for behavioral analysis

CONSIDERATIONS IN THE SELECTION OR CONSTRUCTION OF INSTRUMENTS
1. Validity, reliability, and practicality issues
2. Availability of adequate research instruments
3. Ability and time of the researcher to develop good instruments
4. Purposes and circumstances of the research
5. Philosophy of the researcher

COMPONENTS OF AN INSTRUMENT
1. Title
   o Helps to convey what the instrument is about
2. Introduction
   o Why the instrument was made
   o What does the instrument measure?
   o Contains how the instrument and its contents will be managed
3. Directions or instructions
   o Should be given at the beginning of the instrument, and within the instrument wherever respondents need guidance to complete it
   o Brief and clear
4. Items
   o The heart of the instrument
   o Can take a multitude of item formats
   o Selection items provide the potential choices from which the respondent makes a choice
   o A rating item consists of a stem, which is a phrase, sentence or question that elicits the information, and the response set, from which the respondent makes the selection
   o Response choices may form a graded continuum, or scale, or they may be alternatives
   o Supply items (e.g., open-ended questions) require respondents to provide the answer themselves
5. Demographics
   o Information such as age, gender, occupation, and marital status
   o Should solicit only information that is vital to the study; can be placed at either the beginning or the end
6. Closing section
   o Thank the research respondent for taking part in the study
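
As a concrete rendering of the item vocabulary above, here is a minimal sketch of a selection-type rating item stored as a stem plus an ordered response set, next to a supply item. The field names and wording are hypothetical, invented purely for illustration (Python):

    # Hypothetical structure for a rating item: a stem that elicits the
    # information, plus a response set forming a graded continuum (Likert-style).
    rating_item = {
        "stem": "The instructions were easy to follow.",
        "response_set": [              # ordered continuum, lowest to highest
            (1, "Strongly disagree"),
            (2, "Disagree"),
            (3, "Neutral"),
            (4, "Agree"),
            (5, "Strongly agree"),
        ],
    }

    # A supply item has no response set; the respondent writes the answer.
    supply_item = {"stem": "What would improve the instructions?"}

    for code, label in rating_item["response_set"]:
        print(code, label)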

CONSIDERATIONS IN EVALUATING EXISTING INSTRUMENTS
1. Format considerations
2. Content considerations
3. Standardization – the process of testing the instrument on many different samples of people; by describing their performance, the information can be used by other researchers to compare their results with similar groups under similar circumstances

SELECTING AN APPROPRIATE INSTRUMENT
- Purpose of the study
  o The type of information that is needed
- Research design
- Object of measurement
  o Who or what will be the focus of inquiry
  o "What" – develop an instrument that will be completed by an independent or external rater
  o "Who" – self-report
- Data collection methodology
  o The type of data that might be produced and how those data will be collected
  o This information can help determine the item format to use
- Resources
  o Availability of resources

Rules governing the selection of instruments:
- The highest validity
- The highest reliability (70-100%)
- The greatest ease of administration, scoring, and interpretation
- Test takers' lack of familiarity with the instrument
- Avoidance of potentially controversial matters

Administering the instrument:
- Make arrangements in advance
- Ensure an ideal testing environment
- Be prepared for all probable contingencies

TWO ISSUES IN USING INSTRUMENTS:
1. Validity – the degree to which the instrument measures what it purports to measure
2. Reliability – the degree to which the instrument consistently measures what it purports to measure

VALIDITY
- Defined as the property of an instrument that refers to the ability of the instrument to measure what it intends to measure, and the extent to which it does so
- Types:
  o Content validity – the degree to which an instrument measures an intended content area
  o Criterion-related validity – an individual takes two forms of an instrument, which are then correlated to discriminate between those individuals
     Concurrent validity – scores on one test correlate with scores on another test when both tests are administered in the same time frame
     Predictive validity – a test can predict how well an individual will do in a future situation
  o Construct validity – a series of studies validates that the instrument really measures what it purports to measure
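
To make the criterion-related types concrete, here is a minimal sketch of how a concurrent-validity coefficient is usually obtained: scores from the instrument under study are correlated with scores from an established criterion measure taken in the same time frame. The scores and the pearson_r helper are hypothetical, invented for this example (Python):

    # Concurrent validity check: correlate scores from two instruments
    # administered to the same respondents in the same time frame.
    from math import sqrt

    def pearson_r(x, y):
        """Pearson product-moment correlation between two score lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    new_test = [12, 15, 9, 20, 17, 11, 14]    # instrument under study (invented)
    criterion = [14, 16, 10, 19, 18, 12, 15]  # established instrument (invented)
    print(f"validity coefficient r = {pearson_r(new_test, criterion):.2f}")

The closer the coefficient is to 1.0, the stronger the evidence that the new instrument measures the same attribute as the criterion.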

TYPES OF RELIABILITY:
1. Stability ("test-retest") – the degree to which two scores on the same instrument are consistent over time
2. Equivalence ("equivalent forms") – identical instruments (except for the actual items included) yield identical scores
3. Internal consistency ("split-half", whole test, inter-rater) – the degree to which one instrument yields consistent results; estimated with the Spearman-Brown correlation formula, the Kuder-Richardson and Cronbach's alpha reliabilities, and scorer/rater reliability (e.g., Cohen's kappa)
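
As a worked illustration of the internal-consistency statistics named above, the sketch below computes Cronbach's alpha and a Spearman-Brown-corrected split-half coefficient on a small set of item scores. The data are made up for the example (Python; correlation() requires Python 3.10+):

    # Hypothetical item scores: 5 respondents x 4 items, invented for illustration.
    from statistics import pvariance, correlation

    scores = [
        [3, 4, 3, 4],
        [2, 2, 3, 2],
        [5, 4, 4, 5],
        [1, 2, 1, 2],
        [4, 4, 5, 4],
    ]

    # Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)

    # Split-half: correlate odd- and even-item totals, then step the half-test
    # correlation up to full test length with the Spearman-Brown formula.
    odd = [sum(row[0::2]) for row in scores]
    even = [sum(row[1::2]) for row in scores]
    r_half = correlation(odd, even)
    r_full = 2 * r_half / (1 + r_half)  # Spearman-Brown prophecy formula

    print(f"Cronbach's alpha = {alpha:.2f}")
    print(f"split-half reliability (Spearman-Brown) = {r_full:.2f}")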

PRACTICALITY
- Relates to the usefulness of the instrument for the researcher's intended purpose
- Often viewed in terms of utility and economy with respect to time, money and effort
- Refers to characteristics such as administrability, scorability, etc.

FACTORS THAT DETERMINE THE PRACTICALITY OF A RESEARCH INSTRUMENT
- Ease of administration
- Ease of scoring/coding and decoding of results
- Should be simple and direct
- Should be sufficient but not too many
- Should be clear
- Appropriateness

MEASUREMENT SCALES
- The representation of variables so that they can be quantified
  o Qualitative (categorical)
     Nominal
  o Quantitative
     Ordinal (ranks observations: 1st, 2nd, 3rd)
     Interval (equal differences, with no true zero point)
     Ratio (equal differences, with a true zero point)
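
A small illustration of what each level of measurement permits; the variables below are hypothetical examples, not from the source notes (Python):

    # Invented variables illustrating the four measurement scales.
    blood_type = "O"      # nominal: categories only; equality is the only comparison
    class_rank = 2        # ordinal: order matters ("2nd"), but gaps are not equal
    temp_celsius = 30.0   # interval: equal differences, but 0 does not mean "none"
    weight_kg = 60.0      # ratio: equal differences and a true zero point

    # Ratios are meaningful only on a ratio scale:
    print(f"{weight_kg * 2} kg is genuinely twice as heavy as {weight_kg} kg")
    # By contrast, 60 C is NOT "twice as hot" as 30 C: 0 C is an arbitrary zero.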

Interpretation of results could be done through:
1. Norm-referenced
   o Provides an indication of how one individual performed on an instrument compared to other students taking the same instrument
2. Criterion-referenced
   o Involves a comparison against predetermined levels of performance
3. Self-referenced
   o Involves measuring how an individual's performance changes over time
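
The sketch below applies the three frames of reference to one raw score; the score distribution, cutoff, and earlier score are all made up for the example (Python):

    # One invented raw score interpreted three ways.
    from statistics import mean, pstdev

    group_scores = [55, 62, 70, 48, 75, 66, 59, 80]  # norm group, same instrument
    my_score, cutoff, previous_score = 70, 75, 61

    # 1. Norm-referenced: locate the score relative to the group (z-score).
    z = (my_score - mean(group_scores)) / pstdev(group_scores)
    print(f"norm-referenced: z = {z:.2f} relative to the group")

    # 2. Criterion-referenced: compare against a predetermined performance level.
    print("criterion-referenced:", "passed" if my_score >= cutoff else "below criterion")

    # 3. Self-referenced: compare against the same individual's earlier score.
    print(f"self-referenced: change of {my_score - previous_score:+d} points over time")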

DECISIONS ABOUT RESEARCH INSTRUMENT
- Decisions are made by the researcher and are based on the research question and the design of the study
- The research instrument for an investigation needs to accurately measure the phenomena under study

WRITING YOUR RESEARCH INSTRUMENT IN THE PROPOSAL
1. Name of the tool, author, copyright
2. What it measures, domains
3. Number of items, response procedure
4. Administration procedure
5. Scoring procedure
6. Reliability index

Note: written in PARAGRAPH FORM (3-5 sentences; avoid long sentences and use simple statements)