
Assessment Solutions

White Paper, October 2007

Assessment Solutions, KSB, NIIT Ltd


Table of Contents

EXECUTIVE SUMMARY
INTRODUCTION
ITEM AUTHORING
    EXAM OBJECTIVES
    BLUEPRINTING
    TEST CREATION
    REVIEWING
    FIELD TESTING
HOSTED ONLINE ASSESSMENT ENGINE
    AUTHORING AND QUESTION BANK MAINTENANCE
        Practice and Certification Tests
        Review / Workflow
    TEST CONFIGURATION & DELIVERY
        Test Generation and Delivery
        Test Instruction
        Question Pages
        Review Questions
        Feedback and Hints
        Marks Configuration
        Auto Grading
        Importing and Exporting Tests
        Proctoring
    REPORTING
        Individual Student Performance Chart
        Comparative Analysis
        Individual Student Activity Details
        Performance
    SCHEDULING AND ADMINISTRATION
        User Management
        Test Schedule Management
        Extensibility
        Multilingual
        High Availability
        Open API based Architecture
        Ease of Customization
HOSTING
    ONLINE DELIVERY
    TESTING CENTERS
    OTHER UNIQUE ASSESSMENT TYPES
    PROCTORING
MEASUREMENT THEORIES AND MODELS
    CLASSICAL TEST THEORY
        Item Analysis
        Internal Consistency Analysis
        Test Reliability Analysis
        Test Validity Analysis
        Standardization and Norm Setting
    ITEM RESPONSE THEORY (IRT)
        One-Parameter Logistic Model
        Two-Parameter Logistic Model
        Three-Parameter Logistic Model
    TOOLS AND TECHNOLOGY
OTHER SERVICES
    CUSTOMIZATION AND INTEGRATION
    TECHNICAL SUPPORT
    HELPDESK
    MENTORING
    TEST/PROCESS AUDITING
    PRE-TEST SCREENING
    SCHEDULING AND ADMINISTRATION


Assessment Solutions

Executive Summary

Figure: Assessment Solutions at a glance, across Strategy, Assessment Design, Test Item Development & Management, Psychometric Services, Technology Integration & Management, and Assessment Administration:
- Strategy for diagnostic assessment, formative tests and summative certification based on program needs
- Planning for difficulty levels, discrimination requirements and a statistical plan
- Item bank creation based on overall strategy, audience and exposure size; item sunsetting strategies
- Field testing for item characteristics, test characteristics, and reliability and validity analyses
- IMS QTI compliant test bank; flexible workflows and advanced statistical analysis tools
- Over 2500 physical locations for test delivery
- Hosted test engine and data management services

NIIT's Assessment Solutions provides a complete range of offerings, from strategy and design to implementation and administration, to its customers in the Corporate, Education and Government segments. NIIT provides full solutions, as well as components of its offerings, to various customers for both low-stakes and high-stakes tests. The products and services include:
- Assessment Strategy
- Design & Blueprinting
- Item Authoring
- Assessment Engine
- Hosting
- Delivery
- Reporting and Psychometric Analysis
- Scheduling, Administration, Helpdesk, Proctoring and Tech Support


Introduction
Figure: NIIT Assessment Solutions overview. The solution spans Item Authoring (design, blueprinting, authoring, reviewing, field testing, calibration, item banking, delivery), a Hosted Engine (web-based authoring, randomization, banking by objective, scheduling, delivery, reports, customization), Delivery and Administration (online delivery, testing centers, proctoring, multiple question types, standards compliance, security, pre-test screening), Psychometric Analysis (item analysis, reliability, validity, descriptive statistics, based on Classical Test Theory and Item Response Theory), and Other Services (HelpDesk, tech support, certification process audit, training and mentoring, scheduling and administration).


There is a growing need to professionally assess the knowledge, skills, abilities and attitudes of people in almost every sphere of life, particularly in education and at the workplace. In schools and colleges, as in professional education, assessments can be used very effectively as a tool to determine what learning needs to be imparted (diagnostic assessments), as a tool to aid learning (formative assessments) and as a tool to measure the extent of learning, and hence the effectiveness of the learning or teaching process (summative assessments). At the workplace, it has long been common practice to assess the competence of new hires through tests. Increasingly, employers are assessing, through online testing, the ongoing acquisition of skills and knowledge for each new technology, product, project, process or client. While this phenomenon started with the ICT and related industries, the ubiquity of the Internet and computers has meant that almost every industry is actively embracing online assessments.

A pioneer in computer education, NIIT has been delivering online tests for nearly a decade to its students in the retail education business. NIIT's feature-rich assessment solution comprises:
1. Item Authoring
2. Hosted Assessment Engine
3. Delivery
4. Psychometric Analysis
5. Other Services, including Helpdesk, Technical Support, Certification Process Audit, Scheduling and Administration


Item Authoring
Item authoring is a specialized skill which requires both formal training and supporting systems and processes if the items are to be scientific and effective. For any test on any subject, item authoring follows a standard process. On an ongoing basis, over-exposed and poor-performing items must be retired and new ones added. Test creation follows the process outlined below:
Figure: Test creation process (Objectives from Client → Blueprint Creation → Test Creation → NIIT Test Review & Fixes → Review by Client → Completed Test → Test Item Bank)

Exam Objectives
In this phase, the client provides objectives that are identified according to the required skill level to be tested. The objectives are further split into specific outcomes of assessment items (SOAs). The SOAs specify the cognitive ability to be assessed.

Blueprinting
Next, the NIIT Test Design Team does Blueprint Creation. A test designer creates a blueprint with the help of the instructors and an analyst, and the program chair reviews the blueprint. The blueprint is a specification table that determines the configuration of a test. It lays down rules for the composition of the test. The blueprint ensures test reliability and the definition of:
- Exam objectives
- Difficulty level for each test item
- Types of questions, such as multiple choice, sequencing, or match the following
- Percentage distribution across various ability levels for each objective

The blueprint also enables the analyst and designer to decide the weight assigned to a topic or an SOA, which in turn defines the number of test items to be created for a topic/SOA. The weight assigned to a topic is decided according to the:
- Importance of the topic in measuring the particular ability
- Importance of the topic in the context of the overall assessment

The weight assigned to a topic decides the relative importance of the topic and helps define the marks to be allocated to each test item. The blueprint is developed to ensure that:
- The weight given to each topic in each test is appropriate, so that important topics are not neglected. This contributes to the validity of the test.
- The abilities tested are appropriate. For example, there are sufficient questions requiring application and understanding of logical reasoning.
- Weights of topics and abilities are consistent from test to test; this contributes to the reliability of the test.

Final Blueprint

From the blueprint, a Test Configuration Table is derived by applying the testing conditions. Testing conditions such as the number of items in a test, the time allowed, the maximum marks and the marks assigned to each test item should be determined after careful consideration. The table consists of the:
- Randomization strategy: randomization could be based on item difficulty, exam objectives, or a combination of both
- Item scoring details
- Negative scoring (Y/N)
- Time allocation
- Number of questions
- Cut score
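To make the Test Configuration Table concrete, the testing conditions above can be captured in a small data structure. This is a minimal sketch with illustrative field names, not CLiKS's actual schema:

```python
from dataclasses import dataclass

@dataclass
class TestConfiguration:
    """One row of a Test Configuration Table (field names are
    illustrative, not CLiKS's actual schema)."""
    randomization: str      # "difficulty", "objective", or "both"
    marks_per_item: int     # item scoring details
    negative_scoring: bool  # negative scoring (Y/N)
    time_allowed_min: int   # time allocation in minutes
    num_questions: int      # number of questions
    cut_score_pct: float    # cut score, as a percentage

    def max_marks(self) -> int:
        return self.marks_per_item * self.num_questions

    def passing_marks(self) -> float:
        return self.max_marks() * self.cut_score_pct / 100.0

# A 40-question test, 2 marks each, 60 minutes, 70% cut score:
config = TestConfiguration("both", 2, False, 60, 40, 70.0)
print(config.max_marks(), config.passing_marks())  # 80 56.0
```

Deriving the table from the blueprint then reduces to filling in these fields according to the testing conditions.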

Test Creation
Test Creation follows blueprint creation. This involves the actual writing of the test items and is done by the NIIT Test Creation Team. To do this, the team:
- Identifies the difficulty level for each identified SOA based on Bloom's Taxonomy of cognitive abilities
- Identifies the item type for each SOA based on the analysis and the difficulty level
- Creates each test item based on the NIIT Test Item Creation Standards and Guidelines. These guidelines are based on sound Instructional Design principles and correct use of language.

Reviewing
After the items have been authored, each item must go through a series of rigorous reviews to eliminate errors, ambiguity and biases of any kind.

NIIT Review

After Item Creation, items are reviewed and fixed in the NIIT Test Review and Fixes phase. Items undergo a rigorous review process: each test item is checked against various parameters to ensure that the right ability is tested with the right test item. Only those items that clear the review process are used in a test. Reviews are of the following types:
- ID Review: ensures items are in accordance with Instructional Design principles
- Language Review: ensures clarity of language
- Technical Review: ensures items are technically correct

Review by Client

The review phase is followed by the Review by Client. In this step, the Program Chair reviews the items and identifies any changes to them. Fixes (if any) suggested by the client are made by the NIIT Test Creation Team. The test is then ready for delivery.

Field Testing
Once the items are ready for deployment, they are put through a field test in which a statistically significant number of test-takers, representative of the final test-taker audience, respond to all the items in a controlled environment. The results data collected from this round of testing is subjected to statistical analysis to assess the difficulty and discrimination of each item and the performance of the distractors. Poor-performing items are modified or dropped.

Item Maintenance


Over time, based on exposure and on how well each item has performed in tests, some items need to be retired periodically and replaced by new items. This is an ongoing activity.
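The difficulty and discrimination statistics computed during field testing are standard classical-test-theory quantities. A minimal sketch, assuming 0/1 item scoring and the common upper-lower group comparison (illustrative only, not NIIT's actual analysis code):

```python
def item_statistics(responses, total_scores, group_frac=0.27):
    """Field-test statistics for one item.

    responses    -- list of 0/1 correctness flags, one per test-taker
    total_scores -- each test-taker's total test score
    Returns (difficulty, discrimination): the proportion answering
    correctly, and the difference in that proportion between the
    top and bottom score groups.
    """
    n = len(responses)
    difficulty = sum(responses) / n

    # Rank test-takers by total score and compare the extreme groups.
    k = max(1, int(n * group_frac))
    ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper = sum(responses[i] for i in ranked[:k]) / k
    lower = sum(responses[i] for i in ranked[-k:]) / k
    return difficulty, upper - lower

# Ten test-takers; the item is answered correctly mainly by high scorers:
p, d = item_statistics([1, 1, 1, 0, 1, 0, 0, 1, 0, 0],
                       [98, 92, 90, 85, 80, 70, 65, 60, 50, 40])
print(p, d)  # 0.5 1.0
```

An item that turns out too hard, too easy, or poorly discriminating by these measures would be flagged for modification or retirement.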


Hosted Online Assessment Engine


NIIT's CLiKS Online Assessment Engine is a comprehensive web-based testing and item banking solution that supports the creation, development, delivery, and management of online assessments. The entire solution is hosted at NIIT's Data Center facility. Being an open and customizable framework, the CLiKS Online Assessment Engine can be implemented quickly and tailored to meet the specific business needs of any customer. The CLiKS Assessment Engine offers the following features:

- Test Item Authoring: provides workflow-based question authoring functionality with a built-in multi-level review mechanism.
- Test Configuration: allows for generating exams based on specifications. Interfaces with Authoring for specifications and sends the generated item list to Authoring.
- Test Delivery: allows for launching an individual candidate's test, using the test pack allocated for that candidate. Interfaces with Authoring and sends the results in a standard format.
- Performance Reporting: interfaces with all the components above and generates various standard and custom reports for analysis.
- Test and Candidate Administration: allows for scheduling exams for individuals or batches, along with proctor keys.

Authoring and Question Bank Maintenance


A question consists of a stem that presents a question to the candidate, along with options. The options depend on the question type. Instructional Design experts have different theories for evaluating or testing a user, so individual experts come up with their own designs for the questions they would like to ask. The solution supports various types of questions, as described later. These question types are exhaustive enough to fulfill most Instructional Design requirements.

For each question, the author can add feedback. Feedback is entered at the option level. When the learner selects a wrong option, the feedback is shown to improve the learning experience. Feedback can be disabled for certification tests.

The question editor provides formatting features like bold, italics, underline, text alignment, ordered lists (sequences), unordered lists (bullets), tables, font size, font face, font color, superscript and subscript. In addition, special symbols, images, hyperlinks, audio, and video clips can be attached to the question. The system stores the stem and options in an encrypted form to protect data from unauthorized access.

Some parameters are common to all questions, including positive and negative marking, difficulty levels, hints, and expiry dates. The number of permissible attempts for a question is also configurable.

The assessment engine provides ready templates for the following question types:
- Multiple Choice Single Select
- Multiple Choice Multiple Select
- True and False
- Fill in the Blanks (Text and Numeric)
- Match the Following

- Free text/Essay Type (Subjective response)

The above question types can be presented to the user in different presentation styles. The assessment engine maintains two sets of information for each question: the type of question it represents and the corresponding presentation format. Additional question types can easily be incorporated into the engine, depending on the requirements of the organization/university.

Practice and Certification Tests

A test can be defined as a Practice Test or a Certification Test. Practice tests enable users to check their understanding of the subject and pursue remediation depending on the feedback. Feedback is displayed for each question in a practice test. Certification is an acknowledgment of the skills possessed by an individual. It involves evaluating the skills a person possesses and providing a result along with feedback.

Tests can be configured in both static and randomized modes. In a static test, all candidates taking the test get the same set of questions. In a randomized test, the question set presented to each candidate is unique. The assessment engine can pick questions for a test from a large question bank, based on the test configuration. The system allows creating sections within a test. Sequences within a section can be predetermined, along with specifications for the distribution of questions and online analysis based on statistics. It is also possible to configure sections so that a user is allowed or not allowed to move ahead to the next section without successfully completing the current section.

Review / Workflow

Content is always reviewed before it is allowed for publication. CLiKS provides the Workflow module to facilitate the online review and correction of content before it is published. When a user creates a content piece such as an item or a test, the engine requires it to be reviewed and approved through a defined (configurable) workflow process before it is published.
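The randomized mode described above can be sketched as a draw from the question bank driven by the test configuration. The data shapes and the per-candidate seeding below are assumptions for illustration, not CLiKS's actual selection logic:

```python
import random

def generate_paper(question_bank, blueprint, candidate_id):
    """Draw a question set for one candidate.

    question_bank -- maps (section, difficulty) -> list of question ids
    blueprint     -- maps (section, difficulty) -> number required
    Seeding on the candidate id keeps each candidate's set distinct in
    general, yet reproducible if the paper must be regenerated.
    """
    rng = random.Random(candidate_id)
    paper = []
    for key, count in blueprint.items():
        pool = question_bank[key]
        if count > len(pool):
            raise ValueError(f"question bank too small for {key}")
        paper.extend(rng.sample(pool, count))
    return paper

bank = {("Sec1", "easy"): [f"q{i}" for i in range(10)],
        ("Sec1", "hard"): [f"h{i}" for i in range(5)]}
blueprint = {("Sec1", "easy"): 3, ("Sec1", "hard"): 2}
paper = generate_paper(bank, blueprint, candidate_id=42)
print(len(paper))  # 5
```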
After the initial creation, the content follows a specified path and goes through levels of review and edits. At each level, other roles have specific jobs to perform on the item. Reviewers can send the item back to the creator for changes. Alternately, items can be forwarded to the next-level reviewer until final approval. The Workflow module supports the 3 R's: Routes, Rules, and Roles. A Route is the path that content takes while undergoing review. The path has levels in it, which are assigned to appropriate roles. Roles are the system roles assigned to act upon the content. Rules are the conditions specified while setting up a workflow cycle to define the decisions and actions that a reviewer can take on the content at different levels. Together, these three R's facilitate the functioning of the workflow process in CLiKS.
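A minimal sketch of how the three R's could fit together, with hypothetical role names and actions (this paper does not show CLiKS's actual workflow API):

```python
# Route: the ordered levels content passes through, each assigned a role.
ROUTE = ["author", "id_reviewer", "language_reviewer", "technical_reviewer"]

def apply_rule(level, action):
    """Rules: what a reviewer's decision does at a given level.
    Returns the next level index, or "published" after final approval."""
    if action == "send_back":        # return the item to its creator
        return 0
    if action == "approve":          # forward to the next level...
        if level + 1 < len(ROUTE):
            return level + 1
        return "published"           # ...or publish after the last level
    raise ValueError(f"unknown action: {action}")

# An item approved at every level of the route ends up published:
state = 0
while state != "published":
    state = apply_rule(state, "approve")
print(state)  # published
```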


Test Configuration & Delivery


The screen shown in the figure below is a launch page for an assessment. In this scenario, the assessment is embedded within a course and its learning plan. It is also possible to create assessments as independent activities or standalone assessments.

Figure: The Course Dashboard Screen

Test Generation and Delivery

When a candidate starts a test, a test configuration present in the appropriate test pack is randomly picked and assigned to the student. Thereafter, the test-taker is delivered the test using the same set of questions. The system searches for questions in the Question Pool depending upon the configuration of the test assigned to the candidate. The system processes questions in batches from the generated question pool and displays them on the screen. The system displays the first set of questions as soon as the Testing Engine selects them. While the test-taker attempts these questions, the engine keeps selecting further questions and caching them. Further questions are displayed to the test-taker from this cache. This improves performance, as the test-taker need not wait until the engine has selected all the questions for the test. Also, the system need not wait for the student to attempt the displayed question before fetching the next question from the question pool.


Test Instruction

The test starts with a set of instructions. The figure below displays the attributes and parameters of the test as they appear on the instructions screen.

Figure: Instructions for Assessment screen

Question Pages

Each question in the assessment is displayed on its own page, and each question page includes the following buttons:
- Review: to review and revise all questions in the section. Make sure that all questions are "Attempted" before pressing the End Assessment button.
- Section List: to move to the next section, once you have attempted all questions in the current section.
- Instructions: to return to the initial instructions page.
- Previous Question: to go back to the previous question.
- Submit Answer: to submit your answer and go to the next question.
- Skip Question: to skip the current question and go on to the next question. Un-attempted questions will be marked incorrect, so make sure that you attempt all the questions before clicking End Assessment. You can review and revise all questions in the section by clicking Review Questions.
- End Assessment: to be used when you have reviewed, answered and submitted an answer for every question of the assessment. You can confirm that all the questions have been attempted by clicking the Review Section button.


Figure: The Assessment Screen

Question Types

The Assessment Engine has the ability to incorporate a diverse set of questions that would meet the requirements of most Instructional Design experts. The following question types are supported in the system through readily available templates:
- Multiple Choice Single Select
- Multiple Choice Multiple Select
- True and False
- Fill in the Blanks (Text and Numeric)
- Match the Following
- Free text/Essay Type (Subjective response)

Review Questions

The Review Questions page, as shown in the following figure, displays all the questions and indicates whether or not an answer has been submitted for each question. This page shows the status of all attempts made or not made by the learner, and allows the learner to keep track of progress during the assessment.


Figure: The Questions Review Page on the Assessment Screen

Feedback and Hints

The assessment engine enables a test creator to provide feedback on the responses of the learners. The feedback corrects the learners and reinforces the concept. The assessment engine also enables a test creator to provide hints to learners while they are taking a test. In a certification test, it is possible to penalize learners if they make use of the hints.

Marks Configuration

This enables a test creator to configure marks at the time of test creation. Learners can be marked either in grades or in percentages. The test creator can configure negative marks for each wrong response, and full marks or no marks for a particular question; the marks can be based on the rendering type of the question or on its difficulty level. This gives flexibility to the marking system and discretionary power to the test creator.

Auto Grading

A test creator can select auto grading of the test. In such a case, the test creator need not configure the marking at the time of creating a test. The learner will be automatically graded according to the default marks set in the system.

Importing and Exporting Tests

The assessment engine is IMS QTI compliant. Hence, tests and questions can be easily imported from and exported to other systems that are standards compliant, promoting reuse of existing content.

Proctoring

In a certification test, candidates need to take the test in a controlled environment where their identity is co-signed by the invigilating authority assigned for the test. The delivery engine has an interface for the proctor to co-sign candidates for the test. Proctors are assigned to tests when the test is being scheduled. CLiKS provides the facility to proctor a test through the following mechanisms:

Individual Co-signing

In this form of proctoring, the assessment engine launches a screen to accept the proctor's login credentials as the candidate starts a test. This is applicable only to tests where proctoring has been enabled during configuration. Another mechanism of individual co-signing is one in which the invigilator is provided a unique random key for every examination. The invigilator can write or announce the key to the students in the examination hall.


Mass Co-signing or Group Proctoring

Using this mechanism, a proctor can co-sign a group or batch of students attempting a test from a central location, using an interface provided by CLiKS. This interface provides a screen that displays the students taking an assessment. After the proctor has verified the students, all of them can be selected and co-signed.
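To make the marks configuration and auto-grading rules described earlier concrete, here is a sketch of per-test scoring. The parameter names and default values are assumptions, not CLiKS's actual marking engine:

```python
def score_test(responses, marks=1.0, negative=0.25, auto_grade=False):
    """Score a test: full marks for a correct response, an optional
    negative penalty for a wrong one, zero for a skipped question.
    With auto_grade=True the creator's values are ignored and the
    system-wide defaults (assumed here: 1 mark, no negative) apply.
    Each response is "correct", "wrong", or "skipped"."""
    if auto_grade:
        marks, negative = 1.0, 0.0
    total = 0.0
    for response in responses:
        if response == "correct":
            total += marks
        elif response == "wrong":
            total -= negative
    return total

answers = ["correct", "wrong", "skipped", "correct"]
print(score_test(answers, marks=2.0, negative=0.5))  # 3.5
print(score_test(answers, auto_grade=True))          # 2.0
```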

Reporting
The purpose of the Performance Tracker is to generate various reports to enable analysis of performance and to evaluate the outcomes of students' learning and their assessments. It is also possible to generate administration reports for monitoring usage and the details of key information set up in the system.

Individual Student Performance Chart

This report shows the performance of a single student in a graphical format. The report shows the score of the student in various tests, enabling the viewer to compare the student's performance across the tests in a module. The intended end users of the report are the teaching staff, who can review the performance of students in their own batches.

Comparative Analysis

This report shows the trend of the batch for a test in a graphical format. The report shows how the students are performing in a particular test, enabling the viewer to judge the average scoring ability of candidates. The report presents the score in blocks of 10 along the x-axis and the percentage of candidates who fall in each block along the y-axis.

Individual Student Activity Details

This report lists the details of activities for a candidate in a batch in a tabular format. The intended end users are the teaching staff, who can review the details of online tests taken by students. The report allows the teaching staff to choose a student from their batches and view that student's activity details.

Performance

CLiKS can support a large number of concurrent users, enables streaming of digital content, supports concurrent streams, and manages bandwidth using defined policies and priorities.
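The Comparative Analysis report described above is essentially a histogram: scores binned in blocks of 10 on the x-axis, percentage of candidates on the y-axis. A sketch:

```python
def score_distribution(scores, block=10, max_score=100):
    """Percentage of candidates whose score falls in each block of
    `block` marks (a score of exactly max_score counts in the top
    block). Illustrative helper, not the actual report query."""
    n_blocks = max_score // block
    counts = [0] * n_blocks
    for s in scores:
        counts[min(s // block, n_blocks - 1)] += 1
    return [100.0 * c / len(scores) for c in counts]

dist = score_distribution([5, 15, 18, 42, 55, 58, 61, 77, 90, 100])
print(dist)  # [10.0, 20.0, 0.0, 0.0, 10.0, 20.0, 10.0, 10.0, 0.0, 20.0]
```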

Scheduling and Administration


User Management
CLiKS provides online registration of users in the system and can assign them to the desired roles. It provides the flexibility to assign multiple roles to a user who may be required to undertake the corresponding tasks. Further, CLiKS supports a scenario where a user needs to be granted permission to a specific function of a different role rather than all functions of that role. Permission to an individual function in the CLiKS system can also be revoked explicitly, overriding previously assigned roles. This gives the administrator the flexibility to grant or revoke permissions to users at the function level.

Test Schedule Management
The tests for candidates taking certification or practice tests are scheduled in advance. The test details, time, and duration are provided along with the list of candidates who will take the test. Candidates log on to the system using the assigned user identification and password. If, after a candidate logs on, the current time falls within the stipulated window for the test, a link is displayed for the candidate to start the test. Tests can be re-scheduled for candidates who are unable to take them at the allotted time.
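The function-level permission model described under User Management can be sketched as follows. The role names, function names, and `User` class here are hypothetical illustrations, not actual CLiKS identifiers.

```python
# Minimal sketch of role-based access with function-level grant/revoke
# overrides. Role and function names are invented for the example.

ROLE_FUNCTIONS = {
    "proctor": {"co_sign", "view_candidates"},
    "scheduler": {"schedule_test", "reschedule_test"},
}

class User:
    def __init__(self, name, roles=()):
        self.name = name
        self.roles = set(roles)
        self.granted = set()   # extra functions granted individually
        self.revoked = set()   # functions revoked despite role membership

    def can(self, function):
        if function in self.revoked:   # explicit revocation overrides the role
            return False
        if function in self.granted:   # function-level grant from another role
            return True
        return any(function in ROLE_FUNCTIONS.get(r, set()) for r in self.roles)

u = User("asha", roles=["proctor"])
u.granted.add("schedule_test")    # one scheduler function, not the whole role
u.revoked.add("view_candidates")  # revoked even though the proctor role has it
print(u.can("co_sign"), u.can("schedule_test"), u.can("view_candidates"))  # True True False
```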

Assessment Solutions, Enterprise Learning Solutions, NIIT Ltd



CLiKS Assessment Engine

The tests, once configured and published, can be delivered to candidates using the delivery engine. The delivery engine receives the candidates' responses and evaluates the results, which are stored as part of the student's assessment records. An important feature of the assessment engine is its ability to keep track of individual question papers: a learner can abandon a test midway and resume it at a later date. The engine keeps track of the questions given to the learner in the incomplete test and the options that the learner had marked.

Caching is another important feature of the Online Testing Engine. Questions and their attributes are downloaded and cached on the local machine while the person is answering the first question. This is done in the background without affecting the display and improves system performance, as the learner need not wait for the questions to be downloaded. Time for the test is calculated as real time and does not include the download time, which means that low or high bandwidth does not affect the time allowed to take the test.

Extensibility
CLiKS is a component-based system, which allows a CLiKS component to be replaced with any external system available in the market. The system is open to learning, collaborative, and knowledge base tools developed by other vendors. This is achieved through the implementation of open APIs: any external system can interface with the existing components using the APIs provided for the component.

Multilingual
CLiKS supports the multibyte UTF-8 character set, which allows it to be used for most of the world's languages, including Asian languages such as Japanese, Chinese, and Korean. The data stored in the database, as well as fields, labels, and error messages, can be multilingual.

High Availability
The scalable architecture of CLiKS allows configuration with more than one application server and web server. In such a case, the whole application remains available to users even when one of the servers is not available.

Open API based Architecture
CLiKS follows an open API based architecture, which allows easy integration with external systems in an enterprise. The robust design enforces that programs in a module access only the data of their own module; the data of any other module is accessed through the published APIs. This allows any other application system to get or pass data to CLiKS modules by calling the published APIs, enabling rapid integration with existing applications in the organization. The system is also compatible with LDAP standards, allowing it to be integrated with any other standard application to achieve Single Sign-On through an enterprise portal.
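The resume capability described at the start of this section can be sketched as a session whose served questions and marked options are persisted and later restored. The `TestSession` class and its field names are hypothetical, not the engine's actual storage format.

```python
# Minimal sketch of resumable test state: the engine persists which
# questions were served and which options the learner marked, so an
# abandoned test can continue later. Field names are invented.

import json

class TestSession:
    def __init__(self, question_ids):
        self.question_ids = list(question_ids)
        self.marked = {}          # question_id -> selected option

    def mark(self, qid, option):
        self.marked[qid] = option

    def save(self):
        """Serialize session state, as the engine would to its own store."""
        return json.dumps({"questions": self.question_ids, "marked": self.marked})

    @classmethod
    def resume(cls, blob):
        state = json.loads(blob)
        session = cls(state["questions"])
        session.marked = state["marked"]
        return session

s = TestSession(["q1", "q2", "q3"])
s.mark("q1", "B")
blob = s.save()                    # learner abandons the test here
restored = TestSession.resume(blob)
print(restored.marked)             # {'q1': 'B'}
```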

Ease of Customization
CLiKS allows easy customization based on organizational needs. All static text in the system (screen titles, field labels, error messages, etc.) is retrieved from a multilingual file. This allows rapid modifications to screen names, field names, and other UI elements, eliminating the need for any re-programming.
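The externalized static text described above can be sketched as a per-language resource table with an English fallback. The keys and layout here are illustrative, not the actual CLiKS resource format.

```python
# Sketch of externalized UI text: all static strings live in per-language
# resource tables instead of code, so labels can change with no
# re-programming. Keys and strings below are invented examples.

UI_TEXT = {
    "en": {"login.title": "Sign In", "error.timeout": "Your session has expired."},
    "ja": {"login.title": "ログイン", "error.timeout": "セッションの有効期限が切れました。"},
}

def label(key, lang="en"):
    """Look up a UI string, falling back to English if untranslated."""
    return UI_TEXT.get(lang, {}).get(key) or UI_TEXT["en"][key]

print(label("login.title", "ja"))    # ログイン
print(label("error.timeout", "fr"))  # no French table, falls back to English
```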





CLiKS is a rule-based system, where many rules can be defined for each function at the time of configuration. This again offers tremendous flexibility during implementation without any need for re-programming. The look-and-feel and other changes in organizational policies can be effected in an extremely short duration.

Technical overview of the CLiKS Assessment Engine:
J2EE application server (Pramati)
Oracle RDBMS
Scalable, load-balancing-ready, multilingual architecture
Fully Internet browser based
Portable to all operating systems, including Linux (Red Hat)
Allows a 3-tiered implementation where the web server, application server, and database servers can be separated by firewalls

Hosting
NIIT offers a fully hosted and managed eLearning services environment, providing the production facility and infrastructure appropriate to your technical requirements and strategic business goals. NIIT offers the scalability, security, and transparency of a hosted eLearning infrastructure, without requiring investment in hardware and software evaluation, deployment, and maintenance.

Features:
99.7% server uptime guarantee
24x7 application and systems monitoring
System, database, and network administration
Backups
Redundant and highly resilient Internet connectivity
Security
Fully managed staging environment
Rapid application deployment
Scalable storage

Application Setup
NIIT takes care of the installation of the hardware, the software, and the application, removing the need to hire or train additional IT staff to set up the software or configure the hardware.

Availability
NIIT provides 99.7% uptime for your eLearning application, with the ability to continue uninterrupted services.




Test Delivery

Physical Test Delivery


NIIT has an extensive and growing network of education and test centers in more than 30 countries around the world.

Online Delivery
NIIT delivers online tests to a number of its education and corporate customers around the world. These tests are scheduled and delivered at locations of the customer's choice and are used for a variety of purposes:

Education:
1. Entrance into a program
2. As part of courses, to aid in the learning process
3. Ongoing evaluation in various courses
4. Final exams/graduation

Corporate:
1. Hiring new employees
2. Assessing the impact of training programs
3. Pre-promotion assessment of abilities and behavioral aspects

Testing Centers
NIIT is setting up a network of dedicated testing centers across India to conduct tests in large numbers and to manage scale for its customers. Besides providing a large facility in various cities, each capable of conducting over 1,000 tests per day, these centers also have facilities for pre-assessment screening and post-test interviewing. The testing centers additionally have the capability to conduct various new types of assessments, including automatically assessing language, voice, and accent skills through patent-pending applications developed by NIIT.

Other Unique Assessment Types


NIIT is continuously investing in research and development to improve the effectiveness of computer-assisted assessments. In a significant change, NIIT relies less on multiple-choice questions, which test only the general abilities or theoretical knowledge of test-takers and not their suitability for a specific job role. NIIT now has several new types of assessments that significantly improve the effectiveness of the assessment process. Some examples of new item/assessment types are:

Computer screen simulations
Computer-assisted role-plays
Automated voice and accent assessment
Online programming skills testing

Proctoring
NIIT provides a proctored environment for conducting high-stakes assessments in its dedicated testing centers. These centers are equipped with closed-circuit cameras and staffed by trained proctors for physical proctoring.




Psychometric Analysis
Psychometric analysis is an integral part of NIIT's assessment solution. It is performed at two levels:

After field testing: to evaluate item performance and to (re)calibrate items for difficulty. This forms an integral part of the item authoring process.
On an ongoing basis: to continuously evaluate the performance of tests and of individual items within the tests.

Measurement Theories and Models


NIIT uses a variety of models for psychometric analysis. The model used is selected in close discussion with the customer and also depends on the nature of the tests, the types of items, and the number of test-takers. The popular models used include:

Classical Test Theory (CTT)
Item Response Theory (IRT)
  o Rasch Model (1-parameter logistic model, or 1PLM)
  o Two-Parameter Logistic Model (2PLM)
  o Three-Parameter Logistic Model (3PLM)

Classical Test Theory


Item Analysis
During the construction of tests, test developers need to determine the effectiveness of each individual item in the test. This process of evaluating each item is called item analysis. On the basis of the information item analysis provides about the effectiveness of test items, items can be modified or eliminated from the final test, ultimately ensuring the reliability and validity of the test. Three important item functions that need to be evaluated through item analysis are:
1. Item Difficulty
2. Item Discrimination
3. Distractor Efficiency

Item Difficulty Analysis
Item difficulty is a measure of the overall difficulty of a test item. In CTT, it is represented as the percentage of test-takers who answer an item correctly. An easy item is one that is correctly answered by a majority of test-takers; conversely, a difficult item is answered correctly by only a few test-takers. Typically, an effective test must have a balanced distribution of items of varying degrees of difficulty. With a "difficulty value" assigned to each item, it is easy for the test developer to select items of varying difficulty levels for inclusion in the final test.

Item Discrimination Analysis
The most important function of a test is to differentiate between test-takers in terms of their ability, knowledge, skills, and personality. Therefore, it is crucial that the building blocks of the test, i.e., the items, have the ability to differentiate, say, between high and low performers on an arithmetic test. Through item discrimination analysis, it is possible to evaluate the performance of an item with respect to its ability to discriminate between people of varying abilities or traits. With a definite discrimination index for each item, the test developer can select items with higher discrimination indices for the test.
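The difficulty and discrimination statistics described above can be computed as follows: difficulty as the proportion of test-takers answering the item correctly, and discrimination via the extreme-group method, one of the two common indices alongside the point-biserial correlation. The response data is invented for the example, and the 27% group fraction is a conventional choice, not a CLiKS setting.

```python
# Illustrative CTT item statistics: difficulty is the proportion answering
# correctly; extreme-group discrimination D = p(upper group) - p(lower
# group), taking the top/bottom fraction of test-takers by total score.

def item_difficulty(responses):
    """responses: list of 0/1 scores on one item. Returns p-value in 0..1."""
    return sum(responses) / len(responses)

def extreme_group_discrimination(item_scores, total_scores, frac=0.27):
    """D = p(upper) - p(lower), using the top/bottom `frac` by total score."""
    n = max(1, round(len(total_scores) * frac))
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    lower, upper = order[:n], order[-n:]
    p_upper = sum(item_scores[i] for i in upper) / n
    p_lower = sum(item_scores[i] for i in lower) / n
    return p_upper - p_lower

item = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]    # one item, ten test-takers (invented)
totals = [9, 8, 3, 7, 2, 6, 8, 4, 1, 7]  # total test scores (invented)
print(item_difficulty(item))             # 0.6
print(extreme_group_discrimination(item, totals))
```

Here all of the top scorers and none of the bottom scorers answered the item correctly, so D comes out at its maximum of 1.0.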


The Point-Biserial and Extreme Group methods are commonly used approaches to arrive at the discrimination index for individual items. Besides detailed reports on the discriminative power of individual items, interpretative comments on the psychometric implications of each item's discrimination index are shared with test developers to facilitate the test review process.

Distractor Efficiency Analysis
In tests that use multiple-choice items, the incorrect answer choices to a question (distractors) play an important role. A "good" distractor, which is unambiguously incorrect and yet can confuse the less knowledgeable test-taker, adds to the discrimination value of the question. The effectiveness of distractors is analyzed by measuring the distribution of responses across all the distractors. Distractors that are not selected at all, or not often enough, by test-takers may be discarded, modified, or replaced to improve the efficiency of the item. The Point-Biserial Method is used to compare the group that chooses a distractor with the group that chooses the correct option. A distractor efficiency matrix is prepared for all items in a test, and recommendations are provided to the item developers.

Internal Consistency Analysis
Items in a test must have internal consistency in measuring the proposed construct or variable. That is, items chosen for a test designed to measure a particular ability or trait must assess only that ability or trait and, therefore, have high correlation among themselves and with the test. Information from such analysis is essential for understanding the internal structure of the test and for deciding whether its quality needs further enhancement. With high internal consistency indices, the test developer can confidently rely on the test's ability to assess the proposed ability or trait. The internal consistency of a test is measured by means of different statistical tools.
Common tools used to analyze internal consistency are Cronbach's Alpha, the KR-20 coefficient, and the Spearman-Brown formula.

Test Reliability Analysis
A good test needs to be consistent in its performance. That is, the same test given to a candidate today should produce identical results when the candidate takes it again a few weeks or months later. In a similar way, two or more parallel tests that assess the same ability or trait must show similar or identical results. In the psychometric analysis of ability, personality, and skills tests, two different methods are used to evaluate the reliability of tests: Test-Retest Reliability Analysis and Alternate-Form or Split-Half Reliability Analysis.

Test Validity Analysis
It is absolutely essential that a test measure what it was originally supposed to measure. At various stages of its development, a test needs to be evaluated for its validity. The validity of a test is assessed in different ways: face validity, content validity, concurrent validity, predictive validity, and construct validity.
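Of the internal-consistency tools named above, Cronbach's Alpha has a compact closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), where k is the number of items. A minimal sketch, using an invented response matrix:

```python
# Cronbach's Alpha over a small people-by-items matrix of 0/1 item scores.
# The response data is invented for illustration.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(responses):
    """responses: list of per-person item-score lists (people x items)."""
    k = len(responses[0])
    items = list(zip(*responses))               # transpose to items x people
    totals = [sum(person) for person in responses]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(data), 3))  # 0.8
```

Values closer to 1 indicate that the items vary together, i.e., that they plausibly measure the same underlying trait.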





Standardization and Norm Setting
The process of standardization ensures the representativeness of the test for its target audiences. It enables the test to be administered and scored under uniform conditions, so that it produces comparable results across different situations and target audiences. As part of the process, norms and benchmarks for different groups and situations are set for an unbiased comparison of individual scores on the test. According to the specific requirements of the customer, the team of psychometricians creates standard rules for the administration, scoring, and interpretation of the test. Standardized scores, such as percentiles, T-scores, and STEN scores, together with group-specific comparison norms or benchmarks, are developed for accurate interpretation of individual test scores.
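The standardized scores mentioned above are simple rescalings of a z-score against the norm group: a T-score rescales to mean 50 and SD 10, and a STEN ("standard ten") rescales to mean 5.5 and SD 2, clipped to the 1-10 band. The norm-group mean and SD below are assumed values for illustration, not real norms.

```python
# Illustrative standardized-score conversions from a raw score and an
# assumed norm-group mean/SD. Not actual CLiKS norm tables.

def z_score(raw, norm_mean, norm_sd):
    return (raw - norm_mean) / norm_sd

def t_score(raw, norm_mean, norm_sd):
    """T-score: mean 50, SD 10."""
    return 50 + 10 * z_score(raw, norm_mean, norm_sd)

def sten(raw, norm_mean, norm_sd):
    """STEN: mean 5.5, SD 2, rounded and clipped to 1..10."""
    s = 5.5 + 2 * z_score(raw, norm_mean, norm_sd)
    return min(10, max(1, round(s)))

# A raw score of 72 against an assumed norm group with mean 60, SD 8:
print(t_score(72, 60, 8))  # 65.0
print(sten(72, 60, 8))
```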

Item Response Theory (IRT)


Item Response Theory takes a different approach and style from CTT in analyzing the psychometric properties of a test and its items. While CTT evaluates a test and its items against a normative population on which they are psychometrically analyzed and standardized, IRT analyses are not based on any such relative measures. IRT uses probabilistic models to arrive at the psychometric properties of test items. It assumes that latent variables such as ability and personality traits have an underlying measurement scale, on which individual items, tests, and individual test-takers can be compared.

IRT is a statistical procedure used to model item responses with certain parameters in order to determine the proficiency level of an examinee. IRT computes an estimated item characteristic curve (ICC) for each test item and can use one to three parameters to specify the item response model. IRT has the following advantages over classical item analysis:

Item parameter estimates are independent of the group of examinees selected from the population for whom the test was designed.
Examinee ability estimates are independent of the particular choice of test items used from the population of items that were calibrated.
The precision of the ability estimates is known.

One-Parameter Logistic Model
This IRT model uses only the difficulty parameter for its analysis. Analysis results include item-specific information in the form of an "Item Characteristic Curve" as well as test-specific information in the form of a "Test Characteristic Curve". The location of the curve on the latent ability scale represents the difficulty of an item. This model takes only the difficulty parameter for the purpose of analysis; it assumes a fixed value for the discrimination parameter.

Two-Parameter Logistic Model
The two-parameter IRT model utilizes both the difficulty and the discrimination parameters for the psychometric analysis. The resulting Item Characteristic Curve reveals the difficulty level as well as the discrimination power of the test item. While the location of the curve on the ability scale represents item difficulty, the steepness of the curve indicates the discrimination index of the item.

Three-Parameter Logistic Model
In addition to the difficulty and discrimination parameters, the Three-Parameter Model takes a third dimension into consideration: the guessing parameter. This third parameter indicates the probability of getting the item correct by guessing alone. Therefore, the Item Characteristic Curve resulting from this analysis carries information about item difficulty, item discrimination, and "guessability".
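The three models above can be summarized in one item characteristic curve formula, the three-parameter logistic (3PL) model: P(theta) = c + (1 - c) / (1 + exp(-D * a * (theta - b))), where b is difficulty, a is discrimination, c is the guessing parameter, and D = 1.7 is the conventional scaling constant. Setting c = 0 gives the 2PL, and additionally fixing a gives the 1PL (Rasch) model. The parameter values below are illustrative, not calibrated values.

```python
# Sketch of the 3PL item characteristic curve. With c = 0 this reduces to
# the 2PL model; fixing a as well gives the 1PL (Rasch) model.

from math import exp

def p_correct(theta, a=1.0, b=0.0, c=0.0, D=1.7):
    """Probability of a correct response at ability theta under the 3PL model."""
    return c + (1 - c) / (1 + exp(-D * a * (theta - b)))

# At theta equal to the item difficulty b, the curve sits exactly halfway
# between the guessing floor c and 1:
print(round(p_correct(0.0, a=1.2, b=0.0, c=0.2), 6))  # 0.6
```

Reading the curve geometrically matches the prose above: b shifts the curve along the ability scale, a sets its steepness, and c sets the lower asymptote.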





Tools and Technology


The Assessment Practice of NIIT has built its own proprietary software tools for performing statistical analysis based on the various models of IRT.

Other Services
NIIT provides a range of other assessment-related services to its customers around the world.

Customization and Integration


For medium to large implementations of the CLiKS Assessment Engine, NIIT provides customization of the engine for its customers. Areas of customization include:

User interface
Functionality
Reports

Since CLiKS uses open APIs and supports industry standards, it is also possible to interface or integrate CLiKS with other internal systems that customers may have. Single Sign-On, ERP, and client HR systems are some examples of applications with which CLiKS may be interfaced.

Technical Support
NIIT provides technical support to its customers and test delivery partners to help resolve any technical query.

Helpdesk
Since many test-takers are still taking online tests for the first time, they may need some hand-holding. NIIT has 24x7 toll-free helpdesks that provide this service to our test sponsors.

Mentoring
When tests are used as a practice or preparation mechanism, test-takers need to discuss their performance with someone who is familiar with both the test items and the subject area. NIIT provides this service to several of its customers through phone, email, and chat.

Test-Process Auditing
Some of our customers use our technology to deliver certification tests through a network of their partners and franchisees. In such cases, NIIT provides a service to audit the test delivery process through scheduled visits as well as through mystery shoppers at test delivery locations. This helps our customers maintain a high level of integrity in their testing process.

Pre-test Screening
For certain kinds of tests, e.g., pre-hire assessments, test sponsors may have a set of criteria for screening candidates for eligibility. NIIT provides both manual and automated options for screening.

Scheduling and Administration


When physical test delivery (for proctored tests) is a constraint, tests need to be scheduled and administered in a manner that optimizes the availability of all resources. NIIT provides services for this task.



