Introduction

It is important to assess students’ learning not only through their outputs or products but also through the processes they underwent to arrive at those products. Learning entails not only what students know but what they can do with what they know. It involves knowledge, abilities, values, attitudes, and habits of mind that affect academic success and performance beyond the classroom.

Assessing Student Learning
In order to accomplish the desired goals of learning, teachers first need to assess the performance of their students so that they know exactly what their students know and what they don’t. Assessment results are used as a yardstick for gauging a student’s performance and progress. Assessing student learning helps indicate a student’s level of proficiency and competency. In short, assessment is the collection of information and data in order to improve and enhance the learning experience.

What Does ‘Assessing Student Learning’ Mean?
It involves compiling and comprehending information collected about a student’s performance and progress, with a view to promoting learning effectively and efficiently. It is a comprehensive process comprising several steps: planning, measuring, diagnosing, evaluating, and interpreting the results. To reach reliable and accurate conclusions, it is recommended to use several methods of assessment. Keeping assessments aligned with the required standards is always crucial.

How Is Student Learning Assessed?
The performance of students can be evaluated through paper-and-pencil tests, performance-based assessments, and portfolios. Of these, the paper-and-pencil test is the most widely used form of assessing student learning; it is popularly known as the multiple-choice test. Gradually, however, performance assessments are gaining momentum. More attention is being paid to performance assessment for assessing student learning, since it can measure higher levels of knowledge and skills. The third method, the portfolio, is used to measure a student’s performance over a longer period of time. As the name suggests, a portfolio assessment comprises an assortment of the student’s progress and academic results.

What Criteria Should The Assessments Meet?
Reliability, fairness, validity, and freedom from bias are the fundamentals of a sound assessment. In order to promote learning, an assessment should measure results accurately. Assessments should be valid in terms of their content, construction, and criterion. They should yield reliable information about a student’s performance. Care should be taken to ensure that no unfair assessment practices are implemented on the basis of gender, ethnic origin, status, or religion.

Description of the Authentic Assessment Tools

What is Performance Assessment?
• One in which a teacher observes and makes a judgment about the student’s demonstration of a skill or competency in creating a product, constructing a response, or making a presentation.
• Emphasis is on the student’s ability to perform tasks by producing their own work with their knowledge and skills.
• Examples: singing, playing a piano, performing gymnastics, or a completed paper or project.

Characteristics of Performance Assessment
• Students perform, create, construct, produce, or do something
• Deep understanding and/or reasoning skills are needed and assessed
• Involves sustained work, often days
• Calls on students to explain, justify and defend
• Involves engaging ideas of importance and substance
• Relies on trained assessors’ judgments for scoring
• Multiple criteria and standards are prespecified
• No single “correct” answer

Strengths & Weaknesses of Performance Assessments

Strengths
• Integrates assessment with instruction
• Learning occurs during assessment
• Provides opportunity for formative assessment
• More authentic
• More engaging; active involvement of students
• Emphasis on reasoning skills
• Teachers establish criteria to identify successful performance
• Emphasis on application of knowledge
• Encourages student self-assessment

Weaknesses
• Reliability may be difficult to establish
• Measurement error due to the subjective nature of the scoring
• Inconsistent student performance across time may result in inaccurate conclusions
• Requires considerable teacher time to prepare and student time to complete
• Difficult to plan for the amount of time needed

SEMANTIC DIFFERENTIAL
Semantic differential is a type of rating scale devised by C.E. Osgood (1957) to measure the connotative meaning of objects, events, and concepts or attitudes. It has been used for a variety of purposes, from predicting a political election to identifying changes in personality structure. The respondent is asked to choose where his or her position lies on a scale between two bipolar words, or along a range of words or numbers spanning a bipolar position (for example, `Excellent', `Good', `Adequate', `Poor', `Inadequate'; or from 5 (powerful) down to 1 (weak)). Semantic differentials can be used to describe not only persons but also the connotative meaning of abstract concepts, a capacity used extensively in affect control theory. The Semantic Differential (SD) measures people's reactions to stimulus words and concepts in terms of ratings on bipolar scales defined with contrasting adjectives at each end. An example of an SD scale is:

Good   3 : 2 : 1 : 0 : 1 : 2 : 3   Bad

Usually, the position marked 0 is labeled "neutral," the 1 positions are labeled "slightly," the 2 positions "quite," and the 3 positions "extremely." A scale like this one measures directionality of a reaction (e.g., good versus bad) and also intensity (slight through extreme). Typically, a person is presented with some concept of interest, e.g., Red China, and asked to rate it on a number of such scales. Ratings are combined in various ways to describe and analyze the person's feelings.

A number of basic considerations are involved in SD methodology: (1) Bipolar adjective scales are a simple, economical means for obtaining data on people's reactions. With adaptations, such scales can be used with adults or children, persons from all walks of life, and persons from any culture. (2) Ratings on bipolar adjective scales tend to be correlated, and three basic dimensions of response account for most of the co-variation in ratings. The three dimensions, which have been labeled Evaluation, Potency, and Activity (EPA), have been verified and replicated in an impressive variety of studies. (3) Some adjective scales are almost pure measures of the EPA dimensions; for example, good-bad for Evaluation, powerful-powerless for Potency, and fast-slow for Activity. Using a few pure scales of this sort, one can obtain, with considerable economy, reliable measures of a person's overall response to something. Typically, a concept is rated on several pure scales associated with a single dimension, and the results are averaged to provide a single factor score for each dimension. Measurements of a concept on the EPA dimensions are referred to as the concept's profile.
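The averaging of pure-scale ratings into EPA factor scores described above can be sketched in a few lines. This is a hedged illustration, not from the source: the scale names and ratings below are hypothetical, and ratings run from -3 (negative pole) to +3 (positive pole) as in the SD scheme.

```python
# Ratings of one concept on several "pure" scales for each EPA dimension.
# All scale names and values are hypothetical examples.
ratings = {
    "Evaluation": {"good-bad": 2, "nice-awful": 3, "sweet-sour": 2},
    "Potency":    {"powerful-powerless": -1, "big-little": 0, "strong-weak": -1},
    "Activity":   {"fast-slow": 1, "alive-dead": 2, "noisy-quiet": 0},
}

def epa_profile(ratings):
    """Average the pure-scale ratings within each dimension to get the profile."""
    return {dim: sum(scales.values()) / len(scales)
            for dim, scales in ratings.items()}

profile = epa_profile(ratings)
print(profile)
```

The resulting three averages form the concept's EPA profile, which can then be compared across concepts or respondents.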

(4) EPA measurements are appropriate when one is interested in affective responses. The EPA system is notable for being a multivariate approach to affect measurement. It is also a generalized approach, applicable to any concept or stimulus, and thus it permits comparisons of affective reactions to widely disparate things. EPA ratings have been obtained for hundreds of word concepts, for stories and poems, for social roles and stereotypes, for colors, sounds, shapes, and for individual persons. (5) The SD has been used as a measure of attitude in a wide variety of projects. Osgood et al. (1957) report exploratory studies in which the SD was used to assess attitude change as a result of mass media programs and as a result of messages structured in different ways. Their chapter on attitude balance or congruity theory also presents significant applications of the SD to attitude measurement. The SD has been used by other investigators to study attitude formation (e.g., Barclay and Thumin, 1963), attitudes toward organizations (e.g., Rodefeld, 1967), attitudes toward jobs and occupations (e.g., Triandis, 1959; Beardslee and O'Dowd, 1961; Gusfield and Schwartz, 1963), and attitudes toward minorities (e.g., Prothro and Keehn, 1957; Williams, 1964; 1966). The results in these and many other studies support the validity of the SD as a technique for attitude measurement. The question of validity, and other issues in assessing attitudes with the SD, will be treated in more detail after a general discussion of SD theory and technique.

CHECKLIST
The most common and perhaps the easiest instrument in the affective domain to construct is the checklist. A checklist consists of simple items that the student or teacher marks “absent” or “present”. Here are the steps in constructing a checklist:
1) Enumerate all the attributes and characteristics you wish to observe relative to the concept being measured. For instance, if the concept is “interpersonal relation”, you might identify those indicators or attributes which constitute evidence of good interpersonal relation.
2) Arrange these attributes as a “shopping” list of characteristics.
3) Ask the students to mark those attributes or characteristics which are present and to leave blank those which are not.
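The steps above can be sketched as a minimal program. The “interpersonal relation” attributes below are hypothetical examples; the sketch simply records each attribute as present (True) or absent (False) and counts the marks.

```python
# Hypothetical checklist for the concept "interpersonal relation":
# each attribute is marked present (True) or absent (False).
interpersonal_relation = {
    "Greets classmates politely": True,
    "Listens while others speak": True,
    "Shares materials when asked": False,
    "Resolves disagreements calmly": True,
}

present = sum(interpersonal_relation.values())  # True counts as 1
total = len(interpersonal_relation)
print(f"{present} of {total} attributes present")
```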

PROCESS-ORIENTED PERFORMANCE-BASED ASSESSMENT
Information about outcomes is important. To improve outcomes, we need to know about student experience along the way: about the curricula, teaching, and kinds of students that lead to particular outcomes. Assessment can help us understand which students learn best under what conditions; with such knowledge comes the capacity to improve the whole of their learning. Process-oriented performance-based assessment is concerned with the actual task performance rather than the output or product of the activity. The learning objectives in process-oriented performance-based assessment are stated as directly observable behaviors (learning competencies). These competencies should start from a general statement and then break down into easily observable behaviors.

Example:
Task: Recite a poem by Edgar Allan Poe
Objectives: The activity aims to enable the students to recite the poem “The Raven” by Edgar Allan Poe and to:
1. Recite the poem from memory without referring to notes
2. Use appropriate hand and body gestures in delivering the piece
3. Maintain eye contact with the audience while reciting the poem
4. Create the ambiance of the poem through appropriate rising and falling intonation
5. Pronounce the words clearly and with proper diction

Simple Competencies vs. Complex Competencies:
• Speak with a well-modulated voice → Recite a poem with feeling using appropriate voice quality, facial expressions and hand gestures
• Draw a straight line from one point to another → Construct an equilateral triangle given three non-collinear points
• Color a leaf with a green crayon → Draw and color a leaf with a green crayon

Task Designing (Why and How)
How to design tasks?
1. Identify the activity that would highlight the competencies to be evaluated (reciting a poem, writing an essay, manipulating a microscope)
2. Identify an activity that entails more or less the same set of competencies
3. Find interesting and enjoyable tasks (writing an essay can be boring)

Example: The topic is understanding biological diversity.
Possible task design: Bring the students to a pond or creek and ask them to find as many living organisms as they can. Bring them to a school playground too.
How to assess: Observe how the students develop a system for finding organisms, classifying them, and drawing conclusions about the differences between the biodiversity of the two sites.

Proper Assessment Tool (Scoring Rubrics)
Rubric – a scoring scale used to assess student performance along a task-specific set of criteria. Example: Recitation Rubric

Criteria | Weight | Level of Performance (lowest to highest)

Number of appropriate hand gestures | x1 | 1-4** / 5-9** / 10-12**
Appropriate facial expression | x1 | Lots of inappropriate facial expressions** / Few inappropriate facial expressions** / No inappropriate facial expressions**
Voice inflection | x2 | Monotone voice used** / Can vary inflection with difficulty** / Can easily vary voice inflection**
Incorporates proper ambience through feelings in the voice | x3 | Recitation contains little feeling** / Recitation has some feeling** / Recitation fully captures ambience through feelings in the voice**

Parts of a Scoring Rubric
1. Criteria – characteristics of a good performance task (left-hand column). They are written in shorthand to fit the table; e.g., the full criterion behind “Number of appropriate hand gestures” would be “Includes a sufficient number of hand gestures.”
2. Level of Performance – the degree to which students have met the criterion. The descriptors (marked **) spell out what is expected of students at each level of performance for each criterion (lots of inappropriate facial expressions, monotone voice, etc.). They tell the student what a performance looks like at each level and help distinguish student work.
3. Weight – the mechanism for assigning scores to each criterion.
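As a hedged illustration of how the weight mechanism might work, the sketch below multiplies each criterion's level of performance (1 = lowest, 3 = highest) by the x1/x2/x3 weights of the recitation rubric above and sums the results. The sample ratings are hypothetical.

```python
# Weights follow the recitation rubric's x1/x1/x2/x3 scheme.
rubric_weights = {
    "hand_gestures": 1,
    "facial_expression": 1,
    "voice_inflection": 2,
    "ambience": 3,
}

def total_score(ratings, weights):
    """Multiply each criterion's level (1-3) by its weight and sum."""
    return sum(weights[c] * level for c, level in ratings.items())

# Hypothetical ratings for one student.
ratings = {"hand_gestures": 3, "facial_expression": 2,
           "voice_inflection": 3, "ambience": 2}
print(total_score(ratings, rubric_weights))  # 3 + 2 + 6 + 6 = 17
```

With this scheme the maximum possible score is 21 (level 3 on every criterion), so a raw total can also be reported as a fraction of 21.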

WHY INCLUDE LEVELS OF PERFORMANCE?
• Clearer expectations – students know what is expected of them, and teachers know what to look for in a student’s performance.
• More consistent and objective assessment – teachers can objectively distinguish between a good and a bad performance.
• Better feedback – allows teachers to provide better feedback to students.

Types of Rubric: ANALYTIC VS. HOLISTIC
1. Analytic Rubric – articulates levels of performance for each criterion so the teacher can assess student performance on each criterion.
2. Holistic Rubric – does not list separate levels of performance for each criterion. Instead, a holistic rubric assigns a level of performance by assessing performance across multiple criteria as a whole, giving a more global picture of the student’s performance on the entire task.

3 – Excellent Speaker
• Includes 10-12 changes in hand gestures
• No apparent inappropriate facial expressions
• Utilizes proper voice inflection
• Can create proper ambiance for the poem

2 – Good Speaker
• Includes 5-9 changes in hand gestures
• Few inappropriate facial expressions
• Has some inappropriate voice inflection changes
• Almost creates proper ambiance

1 – Poor Speaker
• Includes 1-4 changes in hand gestures
• Lots of inappropriate facial expressions
• Uses monotone voice
• Cannot create proper ambience

WHEN TO USE A RUBRIC?

• An analytic rubric is more common and assesses tasks that involve a larger number of criteria.
• An analytic rubric better handles weighting of criteria.
• Holistic rubrics are used when a quick judgment needs to be made.
• Holistic rubrics are used for judging minor assessments.

HOW MANY LEVELS OF PERFORMANCE SHOULD I INCLUDE?
There is no specific number of levels a rubric must possess. Start small, then expand. Example:

Makes eye contact with the audience:  never / sometimes / always

Makes eye contact with the audience:  never / rarely / sometimes / usually / always

RECOMMENDATION: include fewer levels of performance, because:
• It’s easier and quicker to administer
• It’s easier to explain to students
• It’s easier to expand later

PRODUCT-ORIENTED PERFORMANCE-BASED ASSESSMENT
Performance-based tasks require performance-based assessment, in which the actual student performance is assessed through a PRODUCT that demonstrates levels of task achievement.
• Student performance – targeted tasks that lead to a product or overall learning outcome.
• Products – may include a wide range of student work that targets specific skills.
• Rubrics – one way to evaluate student performance in a given task as it relates to the final product or learning outcome.

When to use Product Oriented Performance Based Assessment?
When the product of the activity is more important than the performance of the student in the process of learning.

Difference between a process-oriented rubric and a product-oriented rubric: product-oriented rubrics are linked to an assessment of the level of “expertise” manifested by the product (novice/beginner, skilled, and expert levels).

DEFINING LEARNING COMPETENCIES FOR PRODUCTS/OUTPUTS
Level 1 – Does the finished product or project illustrate the minimum expected parts or functions? (Beginner)

Level 2 – Does the finished product or project contain additional parts and functions on top of the minimum requirements which tend to enhance the final output? (Skilled)

Level 3 – Does the finished product contain the basic minimum parts and functions, have additional features on top of the minimum, and is it aesthetically pleasing? (Expert)

EXAMPLE: The desired product is a scrapbook illustrating the historical event called EDSA I People Power.
Learning Competencies: The scrapbook presented by the students must:
1. Contain pictures, newspaper clippings and other illustrations of the main characters of EDSA I (MINIMUM)
2. Contain remarks and captions for the illustrations, made by the student himself, on the roles played by the characters in EDSA I People Power (SKILLED)
3. Be presentable, complete, informative, and pleasing to the reader of the scrapbook (EXPERT)

Performance-based assessment for products and projects can also be used for assessing the outputs of SHORT-TERM TASKS.
Example: The desired output is the output of a typing class.
Learning Competencies: The final typing outputs of the students must:
1. Possess no more than five (5) errors in spelling (MINIMUM)
2. Possess no more than five (5) errors in spelling while observing proper format based on the document to be typewritten (SKILLED)
3. Possess no more than five (5) errors in spelling, have the proper format, and be readable and presentable (EXPERT)

EVIDENCE-BASED – Product-oriented performance-based learning competencies need concrete evidence that a student has achieved a certain level of competence based on the product.
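The Beginner/Skilled/Expert levels of the typing example could be operationalized as a simple decision rule. This is only a sketch under the stated criteria; the function name and inputs are illustrative, not from the source.

```python
def expertise_level(spelling_errors, proper_format, presentable):
    """Classify a typing output against the typing-task competencies:
    MINIMUM  = no more than 5 spelling errors
    SKILLED  = MINIMUM plus proper format
    EXPERT   = SKILLED plus readable and presentable
    """
    if spelling_errors > 5:
        return "Below minimum"
    if proper_format and presentable:
        return "Expert"
    if proper_format:
        return "Skilled"
    return "Beginner"

print(expertise_level(3, True, True))   # Expert
print(expertise_level(4, True, False))  # Skilled
```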

TASK DESIGNING (How to design tasks on POPBA?)
Concepts that may be associated with task designing include:
1. Complexity – the task needs to be within the range of ability of the students. Tasks that are too simple are uninteresting; tasks that are too complicated are frustrating.
2. Appeal – projects should be interesting enough that students are encouraged to see the task through to completion.
3. Creativity – think outside the box (divergent thinking). The project should lead to exploring various possible ways of presenting the output.
4. Goal-based – bear in mind that the project is produced in order to attain a learning objective. Projects are assigned not just for the sake of producing something but for reinforcing learning.

Exercise: Design a project or task for each of the following learning objectives.
1. Analyze the events leading to Rizal’s martyrdom.
2. Illustrate the concept of “diffusion”.
3. Illustrate the cultural diversity of the Philippines.

Scoring Rubrics – These are descriptive scoring schemes that are developed by teachers to guide the analysis of the products or processes of students’ efforts.

Criteria Setting
Criteria are statements which identify “what really counts” in the final output. Examples:
• Quality
• Creativity
• Comprehensiveness
• Accuracy
• Aesthetics

Identify substatements that would make the major criteria more focused and objective. Example: for an essay on “The Three Hundred Years of Spanish Rule in the Philippines”, Quality might be broken down into:
• Interrelates the chronological events in an interesting manner
• Identifies the key players in each period of the Spanish rule and the roles that they played
• Succeeds in relating the history of Philippine Spanish rule

Analytic Rubric for Graphic Display of Data

Criteria | Level 1 | Level 2 | Level 3 | Weight
Title | The title does not reflect what the data show, or the title is missing. | The graph contains a title that generally tells what the data show. | The graph contains a title that clearly tells what the data show. | 10%
Labels | Only some parts of the graph are correctly labeled, or labels are missing. | Some parts of the graph are inaccurately labeled. | All parts of the graph are correctly labeled. | 20%
Accuracy | The data are inaccurately represented, contain major errors, or are missing. | Data representation contains minor errors. | All data are accurately represented on the graph. | 50%
Neatness | The graph is sloppy and difficult to read. | The graph is generally neat and readable. | The graph is very neat and easy to read. | 20%

• Organization of the document is difficult to follow due to a combination of the following:
  1. Inadequate transitions
  2. Rambling format
  3. Insufficient or irrelevant information
  4. Ambiguous graphics
• The document contains numerous distractions that appear in a combination of the following forms:
  1. Flow of thought
  2. Graphical presentation
  3. Grammar/mechanics
• There appears to be no organization of the document’s contents
• Sentences are difficult to read and understand

When are scoring rubrics an appropriate evaluation technique?
Scoring rubrics are appropriate for evaluating:
• Essays
• Group activities
• Oral presentations

Where and when a scoring rubric is used does not depend on the grade level or subject, but rather on the purpose of the assessment.

Other Methods
• Checklists are appropriate for evaluation when the information that is sought is limited to determining whether specific criteria have been met.
• Scoring rubrics are based on descriptive scales and support the evaluation of the extent to which criteria have been met.

Benefits of scoring rubrics:
1. They support the examination of the extent to which the specified criteria have been reached.
2. They provide feedback to students concerning how to improve their performances.

Process of Developing Scoring Rubrics
Steps:
1. Identify the qualities and attributes that you wish to observe in the students’ outputs that would demonstrate their level of proficiency.
2. Decide whether a holistic or an analytic rubric would be appropriate. In an analytic scoring rubric, each criterion is considered one by one and the descriptions of the scoring levels are made separately; in a holistic rubric, the collection of criteria is considered throughout the construction of each level, and the result is a single descriptive scoring scheme.
3. Identify and define the criteria for the top level and the lowest level of performance.
4. Create additional categories (such as average). Each score category should be defined using descriptors of the work rather than value judgments about the work. For example, “The student’s sentences contain no errors in subject-verb agreement” is preferable to “The student’s sentences are good.”
5. Test whether the scoring rubric is reliable: ask two or more teachers to score the same set of projects or outputs and correlate their individual assessments.

Exercise: For each of the following, develop a scoring rubric:
a. Evaluating performance in argumentation and debate
b. Laboratory output in “Frog dissection”
c. Oral presentation on the piece “Land of Bondage, Land of the Free”
d. Essay on “Should the power industry be deregulated?”
e. Group activity on “Geometric shapes through paper folding”
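Step 5, correlating two teachers’ scores, can be sketched with a plain Pearson correlation. The scores below are hypothetical; a coefficient near +1 suggests the two raters are applying the rubric consistently.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

# Hypothetical rubric scores from two teachers for the same six projects.
teacher_a = [3, 2, 3, 1, 2, 3]
teacher_b = [3, 2, 2, 1, 2, 3]
print(round(pearson(teacher_a, teacher_b), 3))
```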

Guidelines for Stating Performance Criteria
1. Identify the steps or features of the performance or task to be assessed by imagining yourself performing it, observing students performing it, or inspecting finished products.
2. List the important criteria of the performance or product.
3. Keep the performance criteria few enough that they can be reasonably observed and judged.
4. Have teachers think through the criteria as a group.
5. Express the criteria in terms of observable student behavior or product characteristics.
6. Avoid vague and ambiguous words like “correctly,” “appropriately,” and “good.”
7. Arrange to use or modify existing performance assessment instruments before constructing your own.

Scoring Rubric for Response Journal Questions
3 – Excellent. Answers are very complete and accurate. Most answers are supported with specific information from the reading, including direct quotations. Sentence structure is varied and detailed. Mechanics are accurate, including spelling, use of capitals, and appropriate punctuation.
2 – Good. Answers are usually complete and accurate. These answers are supported with specific information from the reading. Sentence structure is varied. Mechanics are generally accurate, including spelling, use of capitals, and appropriate punctuation.
1 – Needs Improvement. Answers are inaccurate. These answers need to be supported with specific information. Sentence structure is incomplete. Mechanics need significant improvement.

The Assessment Tools

SEMANTIC DIFFERENTIAL SCALE OF HESTER PRYNNE’S CHARACTERISTICS IN THE NOVEL ‘THE SCARLET LETTER’ BY NATHANIEL HAWTHORNE
Direction: Rate the characteristics of Hester Prynne by putting a check mark on one of the 7 spaces along each dimension.

            3   2   1   0   1   2   3
Honest      _   _   _   _   _   _   _   Deceitful
Kind        _   _   _   _   _   _   _   Harsh
Strong      _   _   _   _   _   _   _   Weak
Fair        _   _   _   _   _   _   _   Biased
Wise        _   _   _   _   _   _   _   Foolish
Helpful     _   _   _   _   _   _   _   Unhelpful
Reliable    _   _   _   _   _   _   _   Unreliable
Loving      _   _   _   _   _   _   _   Unloving
Humble      _   _   _   _   _   _   _   Arrogant
Beautiful   _   _   _   _   _   _   _   Ugly
Good        _   _   _   _   _   _   _   Bad

CHECKLIST FOR A GOOD INTERVIEW

Part: Preparing
• Set a goal for the interview: understand the new product development process and marketing strategies.
• Gather as much information as possible about the new product development process and marketing strategies.
• Search the literature (library).
• Search the World Wide Web for information.
• Develop questions about the new product development process as a starting point for gathering information:
  - What is the new product development process?
  - What are some concerns in the new product development process?
  - How do you manage new product development to be successful? Any guidelines?
  - What is the process of marketing a new product?
  - What are the strategies for the successful marketing of a new product?

Part: Beginning
• Arrange the meeting place (or confirm the reservation).
• Visit the meeting place (at least one time).
• Be on time for the meeting.
• Introduce myself and let the manager know who I am, the topic under investigation, and the purpose of the interview.
• Start seriously but with an easy question: "How long have you worked in product development?"
• Be professional and don't joke or try to be funny unless I am sure it will be well received.

Part: Conducting
• Take notes during the interview.
• Think critically about the information that the manager provides during the interview and develop further questions.
• Reflect on and paraphrase the information given to me by the manager during the interview.
• Take time to make sure I understand what the manager says.
• Provide reassurance that the information being provided to me is useful and informative.
• Respond to the manager's answers with integrity.
• Be honest when responding to the manager's questions.
• Offer my opinion only when asked.
• Be positive in my demeanor.

Part: Concluding
• Give the manager a chance to ask any questions I can answer.
• Summarize the main points of the interview.
• If possible, ask for an opportunity to ask additional questions at a later time.
• Thank the manager for her time.

PROCESS-ORIENTED PERFORMANCE-BASED ASSESSMENT RUBRIC IN ASSESSING A POETRY INTERPRETATION

Mastery (30%)
• Exemplary (4 pts): The student has memorized the entire poem and is able to present it without error.
• Learned (3 pts): The student has memorized the entire poem and is able to present it with just one error, from which he/she recovers.
• Basic (2 pts): The student has memorized the entire poem and is able to present it, but makes three or more errors and doesn’t recover.
• Apprentice (1 pt): The student has not memorized the poem.

Vocal Expression (expressiveness, clarity and audibility) (25%)
• Exemplary: Energizes the audience with enthusiasm. Precisely clear. Audible throughout.
• Learned: Enthusiastic. Clear throughout. Audibility dropped a couple of times.
• Basic: Some variation in voice expressiveness. Generally clear. Generally audible.
• Apprentice: Voice expression is monotonous. Poor articulation. Cannot be heard.

Physical Expression (eye contact, bodily movement) (20%)
• Exemplary: Involves the audience with eye contact and uses very fitting bodily movements and hand gestures.
• Learned: Much eye contact and fitting bodily movements and hand gestures.
• Basic: Some eye contact and not-so-fitting bodily movements and hand gestures.
• Apprentice: Lacks eye contact; stiff or erratic, with unfitting bodily movements and hand gestures.

Emotional Expression (20%)
• Exemplary: The student showed very appropriate emotions and facial expressions in conveying the poem’s theme and message.
• Learned: The student showed appropriate emotions and facial expressions in conveying the poem’s theme and message.
• Basic: The student showed not-so-appropriate emotions and facial expressions.
• Apprentice: The student showed inappropriate emotions and facial expressions.

Props/Costume (5%)
• Exemplary: Used props and costume that are very appropriate to the poem’s theme.
• Learned: Used props and costume that are appropriate to the poem’s theme.
• Basic: Used props and costume that are not so appropriate to the poem’s theme.
• Apprentice: The props and costume used are not appropriate to the poem’s theme.

PRODUCT-ORIENTED PERFORMANCE-BASED ASSESSMENT RUBRIC IN ASSESSING A BROCHURE

Attractiveness & Organization (25%)
• Exemplary (4 pts): The brochure has exceptionally attractive formatting and well-organized information.
• Learned (3 pts): The brochure has attractive formatting and well-organized information.
• Basic (2 pts): The brochure has well-organized information.
• Apprentice (1 pt): The brochure’s formatting and organization of material are confusing to the reader.

Content (30%)
• Exemplary: The brochure includes all required elements; all entries are relevant to the chosen topic.
• Learned: The brochure has all of the required elements; some of the entries are relevant to the chosen topic.
• Basic: The brochure has most of the required elements; only a few entries are relevant to the chosen topic.
• Apprentice: The brochure has little of the required information; entries are irrelevant to the chosen topic.

Writing Mechanics (25%)
• Exemplary: All of the writing is done in complete sentences. Capitalization and punctuation are correct throughout the brochure.
• Learned: Most of the writing is done in complete sentences. Most of the capitalization and punctuation are correct throughout the brochure.
• Basic: Some of the writing is done in complete sentences. Some of the capitalization and punctuation are correct throughout the brochure.
• Apprentice: Most of the writing is not done in complete sentences. Most of the capitalization and punctuation are not correct throughout the brochure.

Graphics (10%)
• Exemplary: All of the graphics go well with the text.
• Learned: Some of the graphics go well with the text.
• Basic: Only a few of the graphics go well with the text.
• Apprentice: The graphics do not go well with the text.

Sources (10%)
• Exemplary: There are many citations from a variety of sources accurately listed on the brochure.
• Learned: There are some citations from a variety of sources accurately listed on the brochure.
• Basic: There are a few citations accurately listed on the brochure.
• Apprentice: Incomplete citations are listed on the brochure.
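As a hedged sketch (not from the source), the brochure rubric’s point levels (1-4) can be combined with its percentage weights into a single weighted score out of 4. The ratings below are hypothetical.

```python
# Percentage weights from the brochure rubric.
weights = {"attractiveness": 0.25, "content": 0.30,
           "mechanics": 0.25, "graphics": 0.10, "sources": 0.10}

# Hypothetical point levels (1 = Apprentice ... 4 = Exemplary) for one brochure.
ratings = {"attractiveness": 4, "content": 3,
           "mechanics": 3, "graphics": 4, "sources": 2}

# Weighted average: each level times its weight, summed (weights total 100%).
score = sum(weights[c] * ratings[c] for c in weights)
print(round(score, 2))  # 1.0 + 0.9 + 0.75 + 0.4 + 0.2 = 3.25
```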

References
Introduction
URL: http://edu.searcheric.org/assessing-student-learning.html

Description of the Authentic Assessment Tools
INTRODUCTION ABOUT PERFORMANCE ASSESSMENT URL: http://www.scribd.com/doc/44751888/Process-OrientedPerformance-Based-Assessment

Semantic Differential Scale
URL: http://www.indiana.edu/~socpsy/papers/AttMeasure/attitude..htm

Checklist
Source: De Guzman-Santos, R., “Educational Assessment II”.

PROCESS-ORIENTED PERFORMANCE-BASED ASSESSMENT URL: http://www.scribd.com/doc/44751888/Process-OrientedPerformance-Based-Assessment

PRODUCT-ORIENTED PERFORMANCE-BASED ASSESSMENT URL: http://www.scribd.com/doc/44904876/Product-OrientedPerformance-Based-Assessment-Part-1
