Assessment Solutions
Table of Contents

Executive Summary
Introduction
Item Authoring
  Exam Objectives
  Blueprinting
  Test Creation
  Reviewing
  Field Testing
Hosted Online Assessment Engine
  Authoring and Question Bank Maintenance
    Practice and Certification Tests
    Review / Workflow
  Test Configuration & Delivery
    Test Generation and Delivery
    Test Instructions
    Question Pages
    Review Questions
    Feedback and Hints
    Marks Configuration
    Auto Grading
    Importing and Exporting Tests
    Proctoring
  Reporting
    Individual Student Performance Chart
    Comparative Analysis
    Individual Student Activity Details
    Performance
  Scheduling and Administration
    User Management
    Test Schedule Management
  Extensibility
  Multilingual
  High Availability
  Open API based Architecture
  Ease of Customization
Hosting
Online Delivery
Testing Centers
Other Unique Assessment Types
Proctoring
Measurement Theories and Models
  Classical Test Theory
    Item Analysis
    Internal Consistency Analysis
    Test Reliability Analysis
    Test Validity Analysis
    Standardization and Norm Setting
  Item Response Theory (IRT)
    One-Parameter Logistic Model
    Two-Parameter Logistic Model
    Three-Parameter Logistic Model
  Tools and Technology
Other Services
  Customization and Integration
  Technical Support
  Helpdesk
  Mentoring
  Test/Process Auditing
  Pre-Test Screening
  Scheduling and Administration

Assessment Solutions, Enterprise Learning Solutions, NIIT Ltd
Executive Summary
- Strategy: Strategy for diagnostic assessment, formative tests and summative certification, based on program needs.
- Assessment Design: Planning for difficulty levels, discrimination requirements, statistical plan.
- Test Item Development & Management: Item bank creation based on overall strategy, audience and exposure size. Item sun-setting strategies.
- Psychometric Services: Field testing for item characteristics, test characteristics, reliability and validity analyses.
- Technology Integration & Management: IMS QTI compliant test bank. Flexible workflows and advanced statistical analysis tools.
- Assessment Administration: Over 2500 physical locations for test delivery. Hosted test engine, data management services.
NIIT's Assessment Solutions provides a complete range of offerings, from strategy and design to implementation and administration, to its customers in the Corporate, Education and Government segments. NIIT provides full solutions as well as components of its offerings to various customers, for both low-stakes as well as high-stakes tests. The products and services include:
- Assessment Strategy
- Design & Blueprinting
- Item Authoring
- Assessment Engine
- Hosting
- Delivery
- Reporting and Psychometric Analysis
- Scheduling, Administration, Helpdesk, Proctoring, Tech Support
Introduction
[Figure: Solution overview. Security; Item Authoring; Hosted Engine; Delivery / Admin; Psychometric Analysis; Other Services. Design; Blueprinting; Authoring / Reviewing; Field Testing; Calibration; Item Banking / Delivery. Web-based authoring; Randomization; Online Delivery; Testing Centers; Proctoring; Multiple Question Types; Security Standards Compliance; Pre-Test Screening.]
Item Authoring
Item authoring is a specialized skill which requires both formal training and systems and processes for the items to be scientific and effective. For any test on any subject, item authoring follows a standard process. On an ongoing basis, over-exposed and poor-performing items must be retired and new ones added. Test creation follows the process outlined below:
[Figure: Test creation process. Blueprint Creation → Test Creation → NIIT Test Review & Fixes → Review by Client → Completed Test]
Exam Objectives
In this phase, the client provides objectives that are identified according to the required skill level to be assessed by the test. The objectives are further split into specific outcomes of assessment (SOAs). The SOAs specify the cognitive ability to be assessed.
Blueprinting
Next, the NIIT Test Design Team does Blueprint Creation. A test designer creates a blueprint with the help of the instructors and an analyst, and the program chair reviews the blueprint. The blueprint is a specification table that determines the configuration of a test. It lays down rules for the composition of the test. The blueprint ensures test reliability and the definition of:
- Exam objectives
- Difficulty level for each test item
- Types of questions, such as multiple choice, sequencing, or match the following
- Percentage distribution across various ability levels for each objective
The blueprint also enables the analyst and designer to decide the weight assigned to a topic or an SOA, which in turn defines the number of test items to be created for a topic/SOA. The weight assigned to a topic is decided according to the:
- Importance of the topic in measuring the particular ability
- Importance of the topic in the context of the overall assessment
The weight assigned to a topic decides the relative importance of the topic and helps define the marks to be allocated to each test item. The blueprint is developed to ensure that:
- The weight given to each topic in each test is appropriate, so that important topics are not neglected. This contributes to the validity of the test.
- The abilities tested are appropriate. For example, there are sufficient questions requiring application and understanding of logical reasoning.
- The weights of topics and abilities are consistent from test to test; this contributes to the reliability of the test.
Final Blueprint
From the blueprint, a Test Configuration Table is derived by applying the testing conditions. Testing conditions such as the number of items in a test, time allowed, maximum marks, and marks assigned to each test item should be determined after careful consideration. The table consists of the:
- Randomization strategy: randomization could be based on item difficulty, exam objectives, or a combination of both
- Item scoring details
- Negative scoring (Y/N)
- Time allocation
- Number of questions
- Cut score
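A row of such a Test Configuration Table can be modeled as a simple record. The sketch below is illustrative only; the field names and values are assumptions, not the actual NIIT schema:

```python
from dataclasses import dataclass

@dataclass
class TestConfiguration:
    """One row of a Test Configuration Table (field names are illustrative)."""
    randomization: str        # "difficulty", "objective", or "both"
    num_questions: int
    time_allowed_minutes: int
    max_marks: int
    negative_scoring: bool    # the Y/N flag from the table
    cut_score: int            # minimum marks required to pass

    def passed(self, marks_obtained: int) -> bool:
        return marks_obtained >= self.cut_score

config = TestConfiguration(randomization="both", num_questions=40,
                           time_allowed_minutes=60, max_marks=100,
                           negative_scoring=True, cut_score=70)
print(config.passed(72))   # a candidate scoring 72 clears the cut score of 70
```

Keeping these conditions in one structure makes it straightforward to validate a configuration before a test is generated from it.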
Test Creation
Test Creation follows blueprint creation. This involves the actual writing of the test items and is done by the NIIT Test Creation Team. To do this, the team:
- Identifies the difficulty level for each identified SOA based on Bloom's Taxonomy of cognitive ability
- Identifies the item type for each SOA based on the analysis and the difficulty level
- Creates each test item based on NIIT Test Item Creation Standards and Guidelines. These guidelines are based on sound Instructional Design principles and correct use of language.
Reviewing
After the items have been authored, each item must go through a series of rigorous reviews to eliminate errors, ambiguity and biases of any kind.

NIIT Review
After Item Creation, items are reviewed and fixed in the NIIT Test Review and Fixes phase. Items undergo a rigorous review process. Each test item is checked against various parameters to ensure that the right ability is tested with the right test item. Only those items that clear the review process are used in a test. Reviews are of the following types:
- ID Review: ensures items are in accordance with Instructional Design principles
- Language Review: ensures clarity of language
- Technical Review: ensures items are technically correct

Review by Client
The Review phase is followed by the Review by Client. In this step, the Program Chair reviews the items and identifies any changes to them. Fixes (if any) suggested by the client are made by the NIIT Test Creation Team. The test is now ready for delivery.
Field Testing
Once the items are ready for deployment, they are put through a field test, where a statistically significant number of test-takers who are representative of the final test-taker audience respond to all the items in a controlled environment. The data collected from this round of testing is subjected to statistical analysis to assess the difficulty, discrimination, and performance of the distractors. Poor-performing items are modified or dropped.

Item Maintenance
Over time, based on exposure and how well each item has performed in the tests, some items need to be retired periodically and replaced by new items. This is an ongoing activity.
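The retirement screening described above can be sketched in a few lines. This is an illustrative sketch, not NIIT's actual tooling: the exposure and difficulty thresholds, and the shape of the item statistics, are assumptions.

```python
def items_to_retire(item_stats, max_exposure=5000, min_p=0.10, max_p=0.95):
    """Flag items for retirement: over-exposed, or with a difficulty index
    (p-value, the proportion of correct responses) outside an acceptable band.
    All thresholds here are illustrative, not an actual policy."""
    flagged = []
    for item_id, stats in item_stats.items():
        p = stats["correct"] / stats["attempts"] if stats["attempts"] else 0.0
        if stats["attempts"] > max_exposure or not (min_p <= p <= max_p):
            flagged.append(item_id)
    return flagged

stats = {
    "Q1": {"attempts": 6000, "correct": 3000},   # over-exposed
    "Q2": {"attempts": 400, "correct": 396},     # too easy (p = 0.99)
    "Q3": {"attempts": 400, "correct": 220},     # healthy item
}
print(items_to_retire(stats))   # ['Q1', 'Q2']
```

Flagged items would then go back through authoring and review rather than being dropped silently.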
Hosted Online Assessment Engine

The hosted online assessment engine comprises the following components:
- Test Item Authoring: provides workflow-based question authoring functionality with a built-in multi-level review mechanism.
- Test Configuration: allows for generating exams based on specifications. Interfaces with Authoring for specifications and sends the generated item list to Authoring.
- Test Delivery: allows for launching an individual candidate's test, using the test pack allocated for the candidate. Interfaces with Authoring and sends the results in a standard format.
- Performance Reporting: interfaces with all the components above and generates various standard and custom reports for analysis.
- Test and Candidate Administration: allows for scheduling exams for individuals or batches, along with proctor keys.
The above question types can be presented to the user in different presentation styles. The assessment engine maintains two sets of information for the questions: one is the type of question it represents, and the other is the corresponding presentation format. Additional question types can easily be incorporated into the engine depending on the requirements of the organization or university.

Practice and Certification Tests
A test can be defined as a Practice Test or a Certification Test. Practice tests enable users to check their understanding of the subject and pursue remediation depending on the feedback. Feedback is displayed for each question in a practice test. Certification is an acknowledgment of the skills possessed by an individual. It involves evaluating the skills a person possesses and providing a result along with feedback for the same. Tests can be configured in both static and randomized modes. In a static test, all candidates taking the test get the same set of questions. In a randomized test, the question set presented to each candidate is unique. The assessment engine can pick questions for a test, based on the test configuration, from a large question bank. The system allows creating sections within a test. Sequences within a section can be predetermined, along with specifications for the distribution of questions and online analysis based on statistics. It is also possible to configure sections so that a user is allowed, or not allowed, to move ahead to the next section without successfully completing the current section.

Review / Workflow
Content is always reviewed before it is allowed for publication. CLiKS provides the Workflow module to facilitate the online review and correction of content before it is published. When a user creates a content piece, such as an item or a test, the engine requires it to be reviewed and approved through a defined (configurable) workflow process before it is published.
After the initial creation, the content follows a specified path and goes through levels of review and edits. At each level, other roles have specific jobs to perform on the item. Reviewers can send the item back to the creator for changes. Alternately, items can be forwarded to the next-level reviewer until their final approval. The Workflow module supports the 3Rs: Routes, Rules, and Roles. A Route is the path that content takes while undergoing review. The path has levels in it, which are assigned to appropriate roles. Roles are the system roles, which are assigned to act upon the content. Rules are the conditions specified while setting up a workflow cycle to define the decisions and actions that a reviewer can take on the content at different levels of the workflow process in CLiKS.
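The route/role/rule mechanism can be sketched as a small state machine. This is a minimal illustration of the concept, not the CLiKS API; the route levels and method names are invented:

```python
# Content moves along a route of review levels; at each level the assigned
# role may approve (forward) or send the item back to its creator.
ROUTE = ["author", "id_review", "language_review", "technical_review"]

class ContentItem:
    def __init__(self, item_id):
        self.item_id = item_id
        self.level = 0               # index into ROUTE
        self.published = False

    def approve(self):
        """Rule: approval at the last level publishes the item."""
        if self.level < len(ROUTE) - 1:
            self.level += 1
        else:
            self.published = True

    def send_back(self):
        """Rule: a reviewer may return the item to its creator."""
        self.level = 0

item = ContentItem("Q-101")
item.approve()                       # author submits -> ID review
item.send_back()                     # ID reviewer returns it for changes
for _ in ROUTE:                      # approved at every level this time
    item.approve()
print(item.published)                # True
```

A configurable workflow would load ROUTE and the per-level roles from setup data instead of hard-coding them.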
Test Generation and Delivery
When a candidate starts a test, a test configuration present in the appropriate test pack is randomly picked and assigned to the candidate. Thereafter, the test-taker is delivered the test using the same set of questions. The system searches for questions in the Question Pool depending upon the configuration of the test assigned to the candidate. The system processes questions in batches from the generated question pool and displays them on the screen. The system displays the first set of questions as soon as the Testing Engine selects them. While the test-taker attempts these questions, the engine keeps selecting further questions and caching them. Further questions are displayed to the test-taker from this cache. This improves performance, as the test-taker need not wait until the engine has selected all the questions for the test. Also, the system need not wait for the candidate to attempt the displayed question before fetching the next question from the question pool.
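The generation-and-caching behaviour described above can be sketched as follows. This is a simplified illustration, assuming a question pool keyed by exam objective; none of the names correspond to actual CLiKS components:

```python
import random
from collections import deque

# Illustrative question pool keyed by exam objective (names are invented).
POOL = {
    "objective_1": [f"O1-Q{i}" for i in range(1, 21)],
    "objective_2": [f"O2-Q{i}" for i in range(1, 21)],
}

def generate_test(pool, per_objective, seed=None):
    """Pick a unique random subset of questions per objective."""
    rng = random.Random(seed)
    selected = []
    for questions in pool.values():
        selected.extend(rng.sample(questions, per_objective))
    return selected

class QuestionCache:
    """Deliver questions in batches: the test-taker answers from the cache
    while later batches are fetched, so nobody waits on the full selection."""
    def __init__(self, question_ids, batch_size=5):
        self.pending = deque(question_ids)
        self.cache = deque()
        self.batch_size = batch_size

    def _prefetch(self):
        for _ in range(min(self.batch_size, len(self.pending))):
            self.cache.append(self.pending.popleft())

    def next_question(self):
        if not self.cache:
            self._prefetch()
        question = self.cache.popleft()
        self._prefetch()             # keep filling while the user answers
        return question

test = generate_test(POOL, per_objective=3, seed=42)
cache = QuestionCache(test)
first = cache.next_question()
print(len(test), first in test)
```

Because `random.sample` draws without replacement, each generated test contains no duplicate items.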
Test Instructions
The test starts with a set of instructions. The figure below displays the attributes/parameters of the test as they appear on the instructions screen.
Question Pages
Each question in the assessment is displayed on its own page, and each question page includes the following buttons:
- Review: to review and revise all questions in the section. Make sure that all questions are "Attempted" before pressing the End Assessment button.
- Section List: to move to the next section, once you have attempted all questions in the current section.
- Instructions: to return to the initial instructions page.
- Previous Question: to go back to the previous question.
- Submit Answer: to submit your answer and go to the next question.
- Skip Question: to skip the current question and go on to the next question. Un-attempted questions will be marked incorrect, so make sure that you attempt all the questions before clicking End Assessment. You can review and revise all questions in this section by clicking Review Questions.
- End Assessment: to be used when you have reviewed, answered and submitted an answer for every question of the assessment. You can confirm that all the questions have been attempted by clicking the "Review Section" button.
Question Types
The Assessment Engine has the ability to incorporate a diverse set of questions that would meet the requirements of most Instructional Design experts. The following question types are supported in the system through readily available templates:
- Multiple Choice Single Select
- Multiple Choice Multiple Select
- True and False
- Fill in the Blanks (Text and Numeric)
- Match the Following
- Free text / Essay Type (Subjective response)
Review Questions
The Review Questions page, as shown in the following figure, displays all the questions and indicates whether or not an answer has been submitted for each question. This page displays the status of all the attempts made or not made by the learner, and allows the learner to keep track of progress during the assessment.
Feedback and Hints
The assessment engine enables a test creator to provide feedback on the responses of the learners. The feedback corrects the learners and reinforces the concept. The assessment engine also enables a test creator to provide hints to the learners while the learner is taking a test. In a certification test, it is possible to penalize learners if they make use of the hints.

Marks Configuration
This enables a test creator to configure marks at the time of test creation. The learners can be marked either in grades or percentage. The test creator can configure negative marks for each wrong response, and full marks or no marks for a particular question; the marks can be based on the rendering type of the question or on its difficulty level. This gives flexibility to the marking system and discretionary power to the test creator.

Auto Grading
A test creator can select auto grading of the test. In such a case, the test creator need not configure the marking at the time of creating a test. The learner will be automatically graded according to the default marks set in the system.

Importing and Exporting Tests
The Assessment Engine is IMS QTI compliant. Hence, tests and questions can be easily imported from and exported to other systems that are standards compliant, promoting reuse of existing content.

Proctoring
In a certification test, the candidates need to take the test in a controlled environment, where their identity is co-signed by the invigilating authority assigned for the test. The delivery engine has an interface for the proctor to co-sign candidates for the test. The proctors are assigned to the tests when the test is being scheduled. CLiKS provides the facility to proctor the test through the following mechanisms:

Individual Co-signing
In this form of proctoring, the assessment engine launches a screen to accept the proctor's login credentials as the candidates start a test.
This is applicable only to tests where proctoring has been enabled during configuration. Another mechanism of individual co-signing is one by which an invigilator is provided a unique random key for every examination. The invigilator can write or announce the key to the students in the examination hall.
"+
Mass Co-signing or Group Proctoring
Using this mechanism, a proctor can co-sign a group or batch of students attempting a test from a central location, using an interface provided by CLiKS. This interface provides a screen that displays the students taking an assessment. After the proctor has verified the students, all the students can be selected and co-signed.
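The marks configuration, hint penalty, and auto-grading options described in the preceding sections can be sketched as a small scoring routine. The mark values and function names here are illustrative assumptions, not the engine's defaults:

```python
def score_response(correct, marks=4, negative_marks=1,
                   used_hint=False, hint_penalty=1):
    """Full marks for a correct answer, a negative mark for a wrong one,
    and a penalty when a hint was used (as in a certification test)."""
    if not correct:
        return -negative_marks
    return marks - (hint_penalty if used_hint else 0)

def grade_test(responses, **scheme):
    """Auto-grading: total the per-question scores, floored at zero."""
    total = sum(score_response(c, used_hint=h, **scheme) for c, h in responses)
    return max(total, 0)

# (correct?, used_hint?) for a five-question test
responses = [(True, False), (True, True), (False, False),
             (True, False), (False, False)]
print(grade_test(responses, marks=4, negative_marks=1))   # 4 + 3 - 1 + 4 - 1 = 9
```

A per-question scheme (marks by difficulty level or rendering type) would simply pass different `marks`/`negative_marks` values per item instead of one scheme for the whole test.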
Reporting
The purpose of the Performance Tracker is to generate various reports to enable analysis of performance and to evaluate the outcomes of students' learning and their assessments. It is also possible to generate administration reports for monitoring the usage and details of key information set up in the system.

Individual Student Performance Chart
This report shows the performance of a single student in a graphical format. The report shows the score of a student in various tests, and enables viewing and comparing the performance of a student across various tests in a module. The intended end users of the report are the teaching staff, who can review the performance of students in their own batches.

Comparative Analysis
This report shows the trend of the batch for a test in a graphical format. The report shows how the students are performing in a particular test, and enables viewing and judging the average scoring ability of candidates. The report presents the score in blocks of 10 along the x-axis and the percentage of candidates who fall under each block along the y-axis.

Individual Student Activity Details
This report lists the details of activities for a candidate in a batch in a tabular format. The intended end users of the report are the teaching staff, who can review the details of online tests taken by students. The report allows the teaching staff to choose a student from their batches and view the details.

Performance
CLiKS can support a large number of concurrent users, enables streaming of digital content, supports concurrent streams, and manages bandwidth using defined policies and priorities.
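The bucketing behind the Comparative Analysis report (scores in blocks of 10 on the x-axis, percentage of candidates on the y-axis) can be sketched as below. This is a minimal sketch of the computation, not the actual report code:

```python
def score_distribution(scores, block=10):
    """Percentage of candidates falling in each block of `block` score points."""
    buckets = {}
    for s in scores:
        low = min(s // block * block, 100 - block)   # a score of 100 folds into 90-100
        label = f"{low}-{low + block}"
        buckets[label] = buckets.get(label, 0) + 1
    n = len(scores)
    return {label: round(100 * count / n, 1)
            for label, count in sorted(buckets.items(),
                                       key=lambda kv: int(kv[0].split("-")[0]))}

scores = [35, 42, 48, 55, 61, 67, 72, 78, 84, 91]
print(score_distribution(scores))
```

The resulting mapping is exactly what a bar chart of the batch trend would be drawn from.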
The tests, after being configured and published, can be delivered to the candidates using the delivery engine. The delivery engine receives the candidates' responses and evaluates the results. These results are stored as part of the student's assessment records. An important feature of the assessment engine is its ability to keep track of individual question papers. A learner can abandon a test midway and resume it at a later date. The engine keeps track of the questions given to the learner in the incomplete test and the options that had been marked by the learner. Caching is another important feature of the Online Testing Engine. Questions and their attributes are downloaded and cached on the local machine while the person is answering the first question. This process is done in the background without affecting the display. This improves system performance, as the learner need not wait for the questions to be downloaded. Time for the test is calculated as real time and does not include the download time. This means that low or high bandwidth does not affect the time allowed to take the test.

Extensibility
CLiKS is a component-based system, which allows a CLiKS component to be replaced with any external system available in the market. The system is open to learning, collaborative, and knowledge base tools developed by other vendors. This is achieved by the implementation of open APIs. Any external system can interface with the existing components using the APIs provided for the component.

Multilingual
CLiKS supports the multibyte UTF-8 character set, which allows it to be used for most of the languages in the world, including Asian languages such as Japanese, Chinese, and Korean. The data stored in the database, fields, labels, and error messages can be multilingual.

High Availability
The scalable architecture of CLiKS allows configuration with more than one Application server and Web server.
In such a case, the whole application will remain available to users even when one of the servers is not available.

Open API based Architecture
CLiKS follows an open API based architecture, which allows easy integration with external systems in an enterprise. The robust design enforces that the programs in any module can directly access only the data of their own module. For accessing the data of any other module, the published APIs are used. This allows any other application system to get data from, or pass data to, CLiKS modules by calling the published APIs, enabling rapid integration with existing applications in the organization. The system is also compatible with LDAP standards, allowing it to be integrated with any other standard application to achieve Single Sign-On through an enterprise portal.
Ease of Customization
CLiKS allows easy customization based on organizational needs. All static text in the system (screen titles, field labels, error messages, etc.) is retrieved from a multilingual file. This allows for rapid modifications to screen names, field names, and other UI elements, thus eliminating the need for any re-programming.
CLiKS is a rule-based system, where many rules can be defined for each function at the time of configuration. This once again offers tremendous flexibility during implementation without any need for re-programming. The look-and-feel and other changes in organizational policies can be effected in an extremely short duration.

Technical overview of the CLiKS Assessment Engine:
- J2EE Application server (Pramati)
- Oracle RDBMS
- Scalable, load-balancing ready, multilingual architecture
- Fully Internet browser based
- Portable on all OS, including Linux (Red Hat)
- Allows 3-tiered implementation, where web server, application server and database servers can be separated by firewalls
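The resource-file mechanism behind Ease of Customization and the multilingual support can be sketched as a simple lookup. The keys, strings, and fallback behaviour below are invented for illustration and are not the CLiKS resource format:

```python
# All static UI text is looked up by key and locale, so renaming a field or
# translating a button needs a data change, not a code change.
RESOURCES = {
    "en": {"btn.submit": "Submit Answer", "btn.skip": "Skip Question"},
    "ja": {"btn.submit": "解答を送信", "btn.skip": "質問をスキップ"},
}

def label(key, locale="en", fallback="en"):
    """Return the label for a key, falling back to the default locale,
    and finally to the key itself if no translation exists."""
    return RESOURCES.get(locale, {}).get(key) or RESOURCES[fallback].get(key, key)

print(label("btn.submit", "ja"))   # Japanese label
print(label("btn.skip", "fr"))     # no French bundle: falls back to English
```

Returning the key itself as a last resort makes missing translations visible on screen instead of failing.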
Hosting
NIIT offers a fully hosted and managed eLearning services environment, providing the production facility and appropriate infrastructure necessary for your technical requirements and strategic business goals. NIIT offers the scalability, security and transparency of a hosted eLearning infrastructure, without requiring investment in hardware and software evaluation, deployment and maintenance.

Features:
- 99.5% server uptime guarantee
- 24x7 application and systems monitoring
- System, database and network administration
- Back-ups
- Redundant and high-resilience Internet connectivity
- Security
- Fully managed staging environment
- Rapid application deployment
- Scalable storage

Application Setup
NIIT takes care of the installation of the hardware, software and the application. NIIT takes away your worry of hiring and training any additional IT staff to set up the software or configure the hardware.

Availability
NIIT provides 99.5% uptime for your eLearning application, with an ability to continue uninterrupted services.
Test Delivery
Online Delivery
NIIT provides delivery of online tests to a number of its education and corporate customers around the world. These tests are scheduled and delivered at locations of the customer's choice. These tests are used for a variety of purposes:

Education:
1. Entrance into a program
2. As part of courses, to aid in the learning process
3. Ongoing evaluation in various courses
4. Final exams/graduation

Corporate:
1. Hiring new employees
2. Assessing the impact of training programs
3. Pre-promotion assessment of abilities and behavioral aspects
Testing Centers
NIIT is setting up a network of dedicated testing centers across India to conduct tests in large numbers and manage scale for its customers. These centers, besides providing large facilities in various cities capable of conducting over 1,000 tests per day each, also have features for pre-assessment screening and post-test interviewing. The testing centers also have the capability to conduct various new types of assessments, including automatically assessing language, voice and accent skills through patent-pending applications developed by NIIT.
Proctoring

NIIT provides a proctored environment for conducting high-stakes assessments in its dedicated testing centers. These centers are equipped with closed-circuit cameras and staffed by trained proctors.
";
Assessment Solutions
Psychometric Analysis

Psychometric analysis is an integral part of NIIT's assessment solution. It is performed at two levels:
- After field testing, to evaluate item performance and to (re)calibrate items for difficulty. This forms an integral part of the item authoring process.
- On an ongoing basis, to continuously evaluate the performance of tests and of the individual items within them.
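The difficulty calibration mentioned above is conventionally based on the classical difficulty index, the proportion of test takers who answer an item correctly. The sketch below illustrates that calculation under the usual 0/1 scoring convention; it is an illustrative example, not NIIT's actual tooling.

```python
# Classical item-difficulty (p-value) calculation on a response matrix:
# rows = test takers, columns = items; 1 = correct, 0 = incorrect.
# Illustrative sketch only.

def difficulty_indices(responses):
    """Return, for each item, the proportion of test takers answering it correctly."""
    n_takers = len(responses)
    n_items = len(responses[0])
    return [sum(row[i] for row in responses) / n_takers for i in range(n_items)]

responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
]
print(difficulty_indices(responses))  # → [0.75, 0.75, 0.25]
```

A low index flags a hard item, a high index an easy one; items falling outside the blueprint's target difficulty band are candidates for recalibration or revision.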
Point-Biserial and Extreme Group Methods are commonly used approaches to arrive at the Discrimination Index for individual items. Besides providing detailed reports on the discriminative power of individual items, interpretative comments on the psychometric implications of each item's discrimination index are shared with test developers to facilitate the test review process.

Distractor Efficiency Analysis
In tests that use multiple-choice items, the incorrect answer choices to a question (distractors) play an important role. A "good" distractor, which is unambiguously incorrect and yet can confuse the less knowledgeable test taker, adds to the discrimination value of the question. The effectiveness of distractors is analyzed by measuring the distribution of responses across all the distractors. Distractors that are never, or only rarely, selected by test takers may be discarded, modified or replaced to improve the efficiency of the item. The Point-Biserial Method is used to compare the group that chooses a distractor with the group that chooses the correct option. A distractor efficiency matrix is prepared for all items in a test and recommendations are provided to the item developers.

Internal Consistency Analysis
Items in a test must have internal consistency in measuring the proposed construct or variable. That is, items chosen for a test designed to measure a particular ability or trait must assess only that ability or trait, and must therefore correlate highly among themselves and with the test as a whole. Information from such analysis is essential for understanding the internal structure of the test and for deciding whether its quality needs further enhancement. With high internal consistency indices, the test developer can confidently rely on the test's ability to assess the proposed ability or trait. The internal consistency of a test is measured by means of different statistical tools.
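The point-biserial method referred to above is, in its textbook form, the correlation between a dichotomous item score and the test total. A minimal sketch of that standard formula is shown below (using the uncorrected total, i.e. one that still includes the item itself); the data are invented for illustration.

```python
# Point-biserial correlation between a 0/1 item score and the total score.
# Standard textbook formula: r_pb = (M1 - M0) / s * sqrt(p * q), where
#   M1, M0 = mean total score of those answering correctly / incorrectly,
#   p, q   = proportion correct / incorrect, s = SD of total scores.
from statistics import mean, pstdev

def point_biserial(item_scores, total_scores):
    """Discrimination index of one dichotomous item against total scores."""
    correct = [t for i, t in zip(item_scores, total_scores) if i == 1]
    incorrect = [t for i, t in zip(item_scores, total_scores) if i == 0]
    p = len(correct) / len(item_scores)
    q = 1 - p
    s = pstdev(total_scores)  # population SD of the totals
    return (mean(correct) - mean(incorrect)) / s * (p * q) ** 0.5

item = [1, 1, 1, 0]          # responses to one item
totals = [2, 1, 3, 1]        # each taker's total test score
print(round(point_biserial(item, totals), 3))
```

A clearly positive value indicates the item discriminates in the right direction; applied to a distractor instead of the keyed answer, the same statistic should come out negative, which is the basis of the distractor efficiency matrix described above.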
Common tools used to analyze internal consistency are Cronbach's Alpha, the KR-20 Coefficient and the Spearman-Brown formula.

Test Reliability Analysis
A good test needs to be consistent in its performance. That is, the same test given to a candidate today should produce near-identical results a few weeks or months later when the candidate takes it again. Similarly, two or more parallel tests that assess the same ability or trait must show similar or identical results. In the psychometric analysis of ability, personality and skills tests, two different methods are used to evaluate the reliability of tests: Test-Retest Reliability Analysis and Alternate-Form (or Split-Half) Reliability Analysis.

Test Validity Analysis
It is essential that a test measure what it was originally intended to measure. At various stages of its development, a test needs to be evaluated for its validity. The validity of a test is assessed in different ways: face validity, content validity, concurrent validity, predictive validity and construct validity.
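Of the internal-consistency tools named above, Cronbach's alpha has a compact closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). The sketch below implements that standard formula on a toy response matrix; it is illustrative only and not tied to any particular product.

```python
# Cronbach's alpha for a response matrix (rows = test takers, cols = items).
# Standard formula: alpha = k/(k-1) * (1 - sum(item variances) / var(totals)).
from statistics import pstdev

def cronbach_alpha(responses):
    """Internal-consistency estimate for a set of items scored per test taker."""
    k = len(responses[0])
    item_vars = [pstdev([row[i] for row in responses]) ** 2 for i in range(k)]
    total_var = pstdev([sum(row) for row in responses]) ** 2
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

responses = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 0],
]
print(cronbach_alpha(responses))  # → 0.75
```

For dichotomous 0/1 items, as here, this coincides with the KR-20 coefficient also mentioned above.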
Standardization and Norm Setting
The process of standardizing a test ensures its representativeness for the target audiences. It enables the test to be administered and scored under uniform conditions, so that the test produces comparable results across different situations and target audiences. As part of the process, norms and benchmarks for different groups and situations are set for an unbiased comparison of individual scores on the test. According to the specific requirements of the customer, the team of psychometricians creates standard rules for the administration, scoring and interpretation of the test. Standardized scores, such as percentiles, T-scores and STEN scores, together with group-specific comparison norms or benchmarks, are developed for accurate interpretation of individual test scores.
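The T-scores and STEN scores mentioned above are both linear rescalings of the z-score relative to a norm group. A minimal sketch under the usual conventions (T: mean 50, SD 10; STEN: mean 5.5, SD 2, clipped to the 1-10 band) is shown below; the norm-group data are invented, and the rounding convention for STEN is an assumption, as practices vary.

```python
# Converting a raw score to standardized scores against a norm group.
# T-score:  50 + 10z   (mean 50, SD 10)
# STEN:     5.5 + 2z, rounded and clipped to 1..10 (common convention; assumed here)
from statistics import mean, pstdev

def t_score(raw, norm_scores):
    """Linear T-score of a raw score relative to a norm group."""
    z = (raw - mean(norm_scores)) / pstdev(norm_scores)
    return 50 + 10 * z

def sten(raw, norm_scores):
    """Standard Ten score, clipped to the 1-10 band."""
    z = (raw - mean(norm_scores)) / pstdev(norm_scores)
    return min(10, max(1, round(5.5 + 2 * z)))

norm = [40, 60, 40, 60]  # toy norm group: mean 50, SD 10
print(t_score(63, norm), sten(63, norm))
```

Because both scales are anchored to the norm group, the same raw score maps to different standardized scores under different group norms, which is why group-specific benchmarks are set during standardization.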
Other Services

NIIT provides a range of other services related to assessments to its customers around the world.

Technical Support

NIIT provides technical support to its customers and test delivery partners to help resolve any technical query.
Helpdesk

Since many test takers are still taking online tests for the first time, they may need some hand-holding. NIIT operates 24x7 toll-free helpdesks that provide this service on behalf of our test sponsors.
Mentoring

When tests are used as a practice or preparation mechanism, test takers need to discuss their performance with someone who is familiar with both the test items and the subject area. NIIT provides this service to several of its customers through phone, email and chat.
Test/Process Auditing

Some of our customers use our technology to deliver certification tests through a network of their partners or franchisees. In such cases NIIT provides a service to audit the test delivery process, through scheduled visits as well as mystery shoppers at test delivery locations. This helps our customers maintain a high level of integrity in their testing process.
Pre-test Screening

For certain kinds of tests, e.g., pre-hire assessments, test sponsors may have a set of criteria for screening candidates for eligibility. NIIT provides both manual and automated options for screening.