

Testing productive skills


Performance on language tests is affected by a wide variety of factors, and an understanding of these factors and how they affect test scores is fundamental to the development and use of language tests.


Language testing specialists have probably always recognized the need to base the development and use of language tests on a theory of language proficiency (for example, Carroll, 1961, 1968; Lado, 1991). More recently, they have turned their attention to communicative language proficiency and to the methods and technology involved in measuring it (Upshur, 1979; Henning, 1994; Bachman and Clark, 1987). The frameworks presented here and in the next section constitute an initial response to this call and reflect my conviction that they are needed if we are to develop and use speaking tests appropriately.

Tests of productive skills measure learners' oral proficiency or their ability to communicate in the target language. Productive skills are measured not only through speaking but also through writing, which is why they are called productive skills: both draw on the competence learners have to generate ideas from their prior knowledge or from the ideas they share in the classroom. Learners' productive skills therefore develop when learners are tested continuously; they develop through completing tasks and sharing the experience of producing ideas, whether in speaking or in writing. Generally, testing speaking skills means assessing communicative competence or oral proficiency, including skills and strategies of grammar and vocabulary use, through scoring and grading.
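Scoring and grading of this kind is often operationalized as an analytic rubric: each criterion is rated separately and the ratings are combined into an overall score. Below is a minimal sketch in Python; the criteria, weights, rating scale, and grade cut-offs are hypothetical, not taken from any particular test.

```python
# Hypothetical analytic rubric for a speaking test.
# Criteria, weights (equal here), scale (0-5), and cut-offs are invented.

RUBRIC_WEIGHTS = {
    "grammar": 0.25,
    "vocabulary": 0.25,
    "fluency": 0.25,
    "communicative_effect": 0.25,
}

def weighted_score(ratings: dict) -> float:
    """Combine per-criterion ratings (0-5 scale) into one weighted score."""
    if set(ratings) != set(RUBRIC_WEIGHTS):
        raise ValueError("ratings must cover exactly the rubric criteria")
    return sum(RUBRIC_WEIGHTS[c] * r for c, r in ratings.items())

def grade(score: float) -> str:
    """Map the weighted 0-5 score onto a letter grade (invented cut-offs)."""
    if score >= 4.5:
        return "A"
    if score >= 3.5:
        return "B"
    if score >= 2.5:
        return "C"
    return "D"

# Example: one learner's ratings on the four criteria.
ratings = {"grammar": 4, "vocabulary": 3, "fluency": 5, "communicative_effect": 4}
overall = weighted_score(ratings)   # 4.0 with the equal weights above
letter = grade(overall)             # "B"
```

An equal-weight rubric is only the simplest case; in practice testers often weight criteria differently (for example, giving communicative effect more weight than grammar) depending on the purpose of the test.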

Construction of effective exams

Davis (1993) suggested the following guidelines for constructing effective exams:

1. Prepare new exams each time you teach a course: though it is time-consuming to develop tests, a past exam may not reflect changes in how you have presented the material or which topics you have emphasized in the course.
2. Make up test items throughout the term: one way to make sure the exam reflects the topics emphasized in the course is to write test questions at the end of each class session and place them on index cards or in computer files for later sorting.
3. Ask students to submit test questions: faculty who use this technique limit the number of items a student can submit and receive credit for.
4. Cull items from colleagues' exams: ask colleagues at other institutions for copies of their exams.
5. Consider making your tests cumulative: cumulative tests require students to review material they have already studied, thus reinforcing what they have learned, and give students a chance to integrate and synthesize course content (Crooks, 1988; Jacobs and Chase, 1992).
6. Prepare clear instructions: test your instructions by asking a colleague
to read them.
7. Include a few words of advice and encouragement on the exam: for example, give students advice on how much time to spend on each section, offer a hint at the beginning of an essay question, or wish students good luck.
8. Put some easy items first: answering easier questions helps students overcome their nervousness and may help them feel confident that they can succeed on the exam (Savitz, 1985).
9. Challenge your best students: include at least one very difficult question, though not a trick question or a trivial one, to engage the interest of the best students.
10. Try out the timing: no purpose is served by creating a test too long for even well-prepared students to finish and review before turning it in (McKeachie, 1986).
11. Give some thought to the layout of the test: use margins and line spacing that make the test easy to read.

1.1 Testing writing

1.1.1. Introduction

The ability to write effectively is becoming increasingly important in our global community, and instruction in writing is thus assuming an increasingly important role in both second and foreign language education. Wherever the acquisition of a language skill is seen as important, it becomes equally important to test that skill, and writing is no exception. Thus, as the role of writing in second language education increases, there is an ever greater demand for valid and reliable ways to test writing ability, both for classroom use and as a predictor of future professional or academic success (Weigle, 2002).
What does it mean to test writing ability? A common-sense answer to this question is that the best way to test people's writing ability is to get them to write (Hughes, 1989).
According to Hughes, decisions about designing assessment tasks or scoring procedures involve a number of key questions:
- What are we trying to test? That is, how are we defining writing ability for the purposes of the test: are we interested primarily in grammatical accuracy at the sentence level, or in writing for a specific communicative function?
- Why do we want to test writing ability? What will we do with the information that we get from the test?
- Who are our test takers? What do we need to know about them in order to design tasks that allow them to perform at their highest level?
- Who will score the tests, and what criteria or standards will be used?
- How can we ensure that raters apply the scoring standards consistently?
- Who will use the information that our test provides? In what form will the information be most useful?
- What are the constraints (of time, materials, money, and labor) that limit the amount and kind of information we can collect about the test takers' writing ability?
- What do we need to know about testing to make our test valid and reliable?
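One practical answer to the question of whether raters apply the scoring standards consistently is to measure inter-rater agreement on a shared set of scripts. Below is a minimal sketch in Python; the two raters' scores are invented for illustration, and a 1-5 band scale is assumed.

```python
def exact_agreement(rater_a, rater_b):
    """Proportion of scripts given the identical band by both raters."""
    if len(rater_a) != len(rater_b):
        raise ValueError("both raters must score the same scripts")
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def adjacent_agreement(rater_a, rater_b, tolerance=1):
    """Proportion of scripts on which the raters differ by at most
    `tolerance` bands (a common, more lenient consistency check)."""
    if len(rater_a) != len(rater_b):
        raise ValueError("both raters must score the same scripts")
    return sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Invented example: two raters, five scripts, 1-5 band scale.
rater_a = [4, 3, 5, 2, 4]
rater_b = [4, 2, 5, 3, 4]
exact = exact_agreement(rater_a, rater_b)        # 0.6: identical on 3 of 5
adjacent = adjacent_agreement(rater_a, rater_b)  # 1.0: never more than 1 band apart
```

Low exact agreement combined with high adjacent agreement usually signals that raters share a ranking of the scripts but interpret band boundaries differently, which rater training or clearer band descriptors can address.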

The relationship between speaking and writing

The relationship between speaking and writing is important for language testing. For example, Brown (1994) provides the following characteristics that ordinarily differentiate spoken language from written language.
Spoken language:
- is transitory and must be processed in real time;
- requires speakers to plan, formulate and deliver their utterances within a few moments if they are to maintain a conversation;
- tends to have shorter clauses connected by coordinators, as well as more redundancy (repetition of nouns and verbs);
- because of its social and cultural uses, is usually less formal than writing.
Written language:
- is permanent and can be read and reread as often as one likes;
- generally gives writers more time to plan, review and revise their words before they are finalized.
In an extensive review of the literature on speaking-writing connections, Sperling (1996) concludes that to talk of written and spoken language differences is to consider the range of communicative purposes to which either writing or speaking is put. In this sense, broader characteristics, such as what gets said and what remains implicit, what is foregrounded and what is backgrounded, and what is stated by whom and under what circumstances, implicate the norms and expectations of the range of contexts in which both writing and speaking are produced (Sperling, 1996, cited in Weigle, 2002: 17).

Basic considerations in assessing writing

1. Test purpose: making inferences and making decisions
Making inferences: since we cannot directly observe a person's language ability, we use his or her responses to test items as data from which we make inferences about the ability that underlies performance on the test. These inferences are then used as data for making a variety of decisions at an individual, classroom, or program level.
As Bachman and Palmer (1996) note, an important aspect of decisions made on the basis of inferences about language ability is whether they are high-stakes or low-stakes decisions. High-stakes decisions have a significant impact on the lives of individuals or on programs and are not easily reversed, so errors in these decisions can be difficult to correct; examples include college admissions and the awarding of funding to schools based on test results.
Low-stakes decisions have a relatively minor impact on individuals and programs, and errors in these decisions tend to have less drastic consequences; an example is placement into one of a series of language courses in an intensive English program.

2. Language use and language test performance

The essential components of language knowledge and strategic competence are summarized by Douglas (2000) as follows.

Language knowledge:
- Grammatical knowledge: vocabulary, morphology and syntax.
- Textual knowledge: cohesion, rhetorical or conversational organization.
- Functional knowledge: ideational, manipulative, heuristic and imaginative functions.
- Sociolinguistic knowledge: dialects/varieties, registers, idiomatic expressions and cultural references.

Strategic competence or language test performance

- Assessment: evaluating the communicative situation or test task and engaging an appropriate discourse domain.
- Goal setting: deciding how and whether to respond to the communicative situation.
- Planning: deciding what elements of language and background knowledge are required to reach the established goal.
- Control of execution: retrieving and organizing the appropriate elements of language knowledge to carry out the plan.
3. Writing as performance assessment
The term performance assessment describes any assessment procedure that involves either the observation of behavior in the real world or a simulation of a real-life activity (i.e. a performance of the ability being assessed) and the evaluation of that performance by raters (Weigle, 2002).
McNamara (1996) provides a useful distinction between a strong sense and a
weak sense of performance assessment in language testing.
In the strong sense of the term, the focus of a performance assessment is on the successful completion of a given task that requires language use, not on the language use itself.
In the weak sense of the term, the focus of the assessment is on the language used, not on the fulfilment of the task. Tasks used to elicit language may resemble real-world writing tasks, but the purpose is to display language proficiency, not the ability to perform the task (for example, to persuade or to apologize).

4. Test usefulness
Bachman and Palmer (1996), cited in Weigle (2002: 48), present three guiding principles for considering the qualities of test usefulness: it is the overall usefulness of the test that is to be maximized, rather than the individual qualities that affect usefulness; the individual test qualities cannot be evaluated independently, but must be evaluated in terms of their combined effect on overall usefulness; and test usefulness, including the appropriate balance among the different qualities, cannot be prescribed in general, but must be determined for each specific testing situation.

1.1.2 Background
Not many centuries ago, writing was a skill that was the exclusive domain of scribes and scholars in educational or religious institutions. Almost every aspect of everyday life for common people was carried out orally: business transactions, records and legal documents were handled through the spoken word, and it fell to scribes to render language into the written word. Today the ability to write has become an indispensable skill in our global literate community. According to Brown (2004), writing skill, at least at rudimentary levels, is a necessary condition for achieving employment in many walks of life and is simply taken for granted in literate cultures. He also states that writing is not a simple task: as a teacher assessing students' writing ability, one needs to be clear about the objective or criterion. Handwriting ability, correct spelling, writing sentences that are grammatically correct, paragraph construction, logical development of a main idea: all of these and more are possible objectives, and each objective can be assessed through a variety of tasks, which we will examine.

1.1.3 Reasons for writing tests

Parrott (1993) stated the following reasons for writing tests:
a. For diagnostic purposes.
b. To develop linguistic competence (for example, copying a model of new language which has been taught).
c. To encourage the development of fluency.
d. To train or provide practice in aspects of writing skills, for example:
- selecting the characteristics and features of particular text types according to the purpose of writing;
- including appropriate stages in the process of composition;
- assessing the knowledge, assumptions, attitudes and interests of the intended audience and addressing them accordingly.


Testing Speaking

Testing, both informal and formal, takes place at the beginning and at the end of most language courses, as well as at various times during the course itself. As Scott notes, at placement an assessment of learners' speaking skills can be done by means of an interview; a test that includes no spoken component provides an inadequate basis for any test of overall language proficiency, whether it aims to test progress during the course or achievement at the end of it.
As Scott also stated, the drawback of an oral component in a test is that it considerably complicates procedures, both in terms of practicality and of the way assessment criteria can be reliably applied.