
SOFTWARE TESTING - XSE471
IMPORTANT TWO MARK QUESTIONS FOR FIVE UNITS:

1) Define software testing.

Software testing determines whether work is progressing according to the plan and whether the expected results are obtained. It checks for performance of the set procedures, changes in conditions, or abnormalities that may appear.

2) What are the components involved in the software development process?
Plan: Devise a plan.
Do: Execute the plan.
Check: Check the results.
Act: Take the necessary action.

3) Who are the parties associated with testing?
Software customer, software user, software tester, information technology management, senior organization management, and auditor.

4) Explain the role of the software customer and the software user.
Software Customer: The party or department that contracts for the software to be developed.
Software User: The individual or group that will use the software once it is placed into production.

5) Explain the role of the software tester and information technology management.
Software Tester: The individual or group that performs the check function on the software.
Information Technology Management: The individual or group with responsibility for fulfilling the information technology mission. Testing supports fulfilling that mission.

6) Explain the role of senior organization management and the auditor.
Senior Organization Management: The CEO of the organization and other senior executives who have the responsibility of fulfilling the organization's mission. Information technology is an activity that supports fulfilling that mission.
Auditor: One or more individuals having the responsibility to evaluate the effectiveness, efficiency, and adequacy of controls in the information technology area. Testing is considered a control by the audit function.

7) What is a defect?
A defect is a variance from a desired product attribute. Testers look for defects. There are two categories of defects: 1) a variance from a product specification, and 2) a variance from customer/user expectation. Defects generally fall into one of three categories: 1) wrong, 2) missing, 3) extra.

8) Explain the three categories of defects.
1) Wrong: The specifications have been implemented incorrectly. This defect is a variance from the customer/user specification.
2) Missing: A specified or wanted requirement is not in the built product. This can be a variance from specification, an indication that the specification was not implemented, or a requirement of the customer identified during or after the product was built.
3) Extra: A requirement incorporated into the product that was not specified. This is always a variance from specification.

9) What is a Kiviat chart?
For each category in the Kiviat chart, the number of "yes" responses is counted and a dot is placed on the line representing that number of "yes" responses. The same is done for every other category. The dots are then connected by a line, and the result is called the footprint of the status of the software organization.

10) Define risk.
A risk is a condition that can result in a loss. It is not possible to eliminate risks, but it is possible to reduce their occurrence or the impact of the loss. Example: types of strategic risk associated with the development and installation of a computer system can be: 1) incorrect results will be produced, 2) unauthorized transactions will be accepted by the system, 3) computer file integrity will be lost, 4) the system will be difficult to operate.
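To illustrate how the footprint in question 9 is built, here is a minimal sketch (the category names and survey answers are hypothetical) that tallies the "yes" counts that would be plotted on the Kiviat chart:

    # Hypothetical self-assessment answers per category ("yes"/"no").
    answers = {
        "Test planning":   ["yes", "yes", "no", "yes"],
        "Defect tracking": ["yes", "no", "no", "no"],
        "Test tools":      ["yes", "yes", "yes", "no"],
    }

    # One point per category: the number of "yes" responses.
    # These counts are what get plotted and connected to form the footprint.
    footprint = {category: responses.count("yes")
                 for category, responses in answers.items()}

    for category, score in footprint.items():
        print(f"{category}: {score}")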

11) Define testing policy.
A testing policy is management's definition of testing for a department. A testing policy involves the following criteria:
1) Definition of testing: A clear, brief, and unambiguous definition of testing.
2) Testing system: The method through which testing will be achieved and enforced.
3) Evaluation: How information services management will measure and evaluate testing.
4) Standards: The standards against which testing will be measured.

12) What are the life cycle phases involved in the structured approach to testing?
1) Requirements phase 2) Design phase 3) Program phase 4) Test phase 5) Installation phase 6) Maintenance phase.

13) Define test strategy and explain its components.
The test strategy addresses the risks and presents a process that can reduce those risks. The two components of the testing strategy are the test factors and the test phase.
Test Factor: The risk or issue that needs to be addressed as part of the test strategy. The strategy will select those factors that need to be addressed in the testing of a specific application system. E.g.: correctness, file integrity, authorization, audit trail.
Test Phase: The phase of the system development life cycle in which testing will occur.

14) Explain the test factors of maintainability, portability, and audit trail.
Maintainability: The effort required to locate and fix an error in an operational system. (Error is used in the broad context to mean both a defect in the system and a misinterpretation of user requirements.)
Portability: The effort required to transfer a program from one hardware configuration and/or software system environment to another. The effort includes data conversion, program changes, operating system changes, and documentation changes.
Audit Trail: The capability to substantiate the processing that has occurred. The processing of data can be supported through the retention of sufficient evidential matter to substantiate the accuracy, completeness, timeliness, and authorization of data. The process of saving the supporting evidential matter is frequently called an audit trail.

15) Explain the test factors of coupling and ease of operation.
Coupling: The effort required to interconnect components within an application system and with all other application systems in their processing environment.
Ease of Operation: The amount of effort required to integrate the system into the operating environment and then to operate the application system.

16) Why are defects hard to find?
Finding defects in a system is not easy. Some are easy to spot; others are more subtle. There are two reasons defects go undetected:
1) Not looking: Tests often are not performed because a particular test condition was unknown. Also, some parts of a system go untested because developers assume software changes don't affect them.
2) Looking, but not seeing: Sometimes developers become so familiar with their system that they overlook details, which is why independent verification and validation is used to provide a fresh viewpoint.

17) List out the possible circumstances in which defects are typically found in software systems.
1. IT improperly interprets requirements. 2. The users specify the wrong requirements. 3. The requirements are incorrectly recorded. 4. The design specifications are incorrect. 5. The program specifications are incorrect. 6. There are errors in program coding. 7. There are data entry errors. 8. There are testing errors. 9. There are mistakes in error correction. 10. The corrected condition causes another defect.

18) How to reduce the cost of testing?
The cost of defect identification and correction increases exponentially as the project progresses. Testing should therefore begin during the first phase of the life cycle and continue throughout the life cycle; such life cycle testing is essential to reduce the cost of testing.

19) What are verification and validation?
Verification: A tester uses verification methods to ensure that the system (software, hardware, documentation, and personnel) complies with an organization's standards and processes, relying on reviews or non-executable methods.

Verification answers the question, "Did we build the system right?"
Validation: Validation physically ensures that the system operates according to plan by executing the system functions through a series of tests that can be observed and evaluated. Validation addresses the question, "Did we build the right system?"

20) Define functional testing.
Functional testing is sometimes called black-box testing because no knowledge of the internal logic of the system is used to develop test cases. Example: if a certain function key should produce a specific result when pressed, a functional test validates this expectation by pressing the function key and observing the result.

21) Define structural testing.
Structural testing is sometimes called white-box testing because knowledge of the internal logic of the system is used to develop hypothetical test cases. If a software development team creates a block of code that will allow a system to process information in a certain way, a test team verifies this structurally by reading the code and, given the system's structure, judging whether the code could work reasonably.

22) Define the workbench concept.
The workbench is a way of illustrating and documenting how a specific activity is to be performed. There are four components of each workbench:
Input: The entrance criteria.
Procedure to do: The work tasks or processes that will transform the input into the output.
Procedure to check: The processes that determine that the output meets the standards.
Output: The exit criteria or deliverables produced from the workbench.

23) What are the eight considerations in developing testing methodologies?
The objective of the eight considerations is to provide the framework for developing the testing tactics. The eight considerations are:
1) Acquire and study the test strategy 2) Determine the type of development project 3) Determine the type of software system 4) Determine the project scope 5) Identify the tactical risks 6) Determine when testing should occur 7) Build the system test plan 8) Build the unit test plan.
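To illustrate the functional (black-box) view in question 20, here is a minimal sketch using a hypothetical discount function and Python's unittest module. The test cases are derived only from the stated requirement (inputs and expected outputs), not from the internal logic:

    import unittest

    def apply_discount(price, is_member):
        """Hypothetical function under test: members get 10% off."""
        return round(price * 0.9, 2) if is_member else price

    class BlackBoxDiscountTest(unittest.TestCase):
        # Black-box cases: derived from the requirement, not the code.
        def test_member_gets_ten_percent_off(self):
            self.assertEqual(apply_discount(100.0, True), 90.0)

        def test_non_member_pays_full_price(self):
            self.assertEqual(apply_discount(100.0, False), 100.0)

    if __name__ == "__main__":
        unittest.main()

A structural (white-box) tester would instead read the body of apply_discount and design cases that exercise each branch of the conditional.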

24) Explain the concept of application fit.
Fit is a concept that implies how usable, helpful, and meaningful the application is in the performance of the day-to-day function. The four components of fit are:
1) Data: The reliability, timeliness, consistency, and usefulness to the user of the data included in the automated application.
2) People: The skills, training, aptitude, and desire to properly use and interact with the automated application.
3) Structure: The proper development of application systems to optimize technology and satisfy requirements.
4) Rules: The procedures that are to be followed in processing the data.

25) Define dynamic testing.
Dynamic analysis requires that the program be executed and hence involves the traditional notion of program testing. That is, the program is run on some test cases and the results of the program's performance are examined to check whether the program operated as expected.

26) Define static testing.
Static analysis does not usually involve actual program execution. Common static analysis techniques include such tasks as syntax checking.

27) List out the structural system testing techniques.
The structural system testing techniques are: stress testing, execution testing, recovery testing, operations testing, compliance testing, and security testing.

28) Define the stress testing technique.
Stress testing is designed to determine whether the system can function when subjected to large volumes - larger than would normally be expected. E.g.: the areas that are stressed include input transactions, internal tables, disk space, output, communications, computer capacity, etc.

29) Define the execution testing technique.
Execution testing is designed to determine whether the system achieves the desired level of proficiency in a production status. Execution testing can verify response times and turnaround times, as well as design performance.
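As a small illustration of the static/dynamic distinction in questions 25 and 26, the sketch below (the program source and function name are hypothetical) first performs a static syntax check without running the code, then performs a dynamic check by executing it against a test case:

    import ast

    source = "def double(x):\n    return x * 2\n"   # hypothetical program under test

    # Static analysis: syntax checking without executing the program.
    try:
        ast.parse(source)
        print("static check: syntax OK")
    except SyntaxError as exc:
        print(f"static check: syntax error - {exc}")

    # Dynamic analysis: run the program on a test case and examine the result.
    namespace = {}
    exec(source, namespace)                 # execute the code under test
    result = namespace["double"](21)
    print("dynamic check:", "pass" if result == 42 else "fail")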

30) Define the recovery testing technique.
Recovery is the ability to restart operations after the integrity of the application has been lost. The process normally involves reverting to a point where the integrity of the system is known, and then reprocessing transactions up until the point of failure. The time required to recover operations is affected by the number of restart points, the volume of applications run on the computer center, the training and skill of the people conducting the recovery operation, and the tools available for recovery.

31) Define the operations testing technique.
Operations testing is designed to verify, prior to production, that the operating procedures and staff can properly execute the application.

32) Define the compliance testing technique.
Compliance testing verifies that the application was developed in accordance with information technology standards, procedures, and guidelines. The methodologies are used to reduce the cost and increase the maintainability of the application system.

33) Define the security testing technique.
Security is a protection system that is needed both to secure confidential information and, for competitive purposes, to assure third parties that their data will be protected. Security testing is designed to evaluate the adequacy of the protective procedures and countermeasures.

34) List out the functional system testing techniques.
The types of techniques useful in performing functional testing include: requirements testing, regression testing, error-handling testing, manual-support testing, intersystem testing, control testing, and parallel testing.

35) Define requirements testing.
Requirements testing must verify that the system can perform its functions correctly and that the correctness can be sustained over a continuous period of time. Unless the system can function correctly over an extended period of time, management will not be able to rely upon the system.

36) Define regression testing.
One segment of the system is developed and thoroughly tested. Then a change is made to another part of the system, which has a disastrous effect on the thoroughly tested portion: either the incorrectly implemented change causes a problem, or the change introduces new data or parameters that cause problems in a previously tested segment. Regression testing retests the previously tested segments to ensure that they still function properly after the change has been made.
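A minimal regression-testing sketch for question 36, using a hypothetical pricing module and pytest-style test functions (assuming pytest is available): the previously passing suite is simply re-run after every change, so a change elsewhere in the system cannot silently break the already-tested segment.

    # test_pricing.py - hypothetical regression suite, re-run after every change.

    def net_price(gross, tax_rate=0.10):
        """Previously tested and released pricing function."""
        return round(gross * (1 + tax_rate), 2)

    def test_net_price_default_tax():
        # Existing behaviour locked in by the regression suite.
        assert net_price(100.0) == 110.0

    def test_net_price_zero_tax():
        assert net_price(50.0, tax_rate=0.0) == 50.0

Running "pytest test_pricing.py" after any later change reveals whether previously working behaviour has regressed.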

37) Define error-handling testing.
Error-handling testing should test the introduction of the error, the processing of the error, the control condition, and the re-entry of the condition properly corrected. This requires error-handling testing to be an iterative process in which errors are first introduced into the system, then corrected, then re-entered into another iteration of the system to satisfy the complete error-handling cycle.

38) Define the manual-support testing technique.
Manual-support testing involves all the functions performed by people in preparing data for, and using data from, automated applications. Specific objectives of manual-support testing include: verifying that the manual-support procedures are documented and complete; determining that manual-support responsibility has been assigned; determining that the manual-support people are adequately trained; and determining that the manual support and the automated segment are properly interfaced.

39) Define the intersystem testing technique.
Application systems are frequently interconnected to other application systems. The interconnection may be data coming into the system from another application or leaving for another application. Frequently multiple applications - sometimes called cycles or functions - are involved. Intersystem testing is designed to ensure that the interconnections between applications function correctly.

40) Define the control testing technique.
One-half of the total system development effort is directly attributable to controls. Controls include data validation, file integrity, audit trail, backup and recovery, documentation, and the other aspects of systems related to integrity. The control testing technique is designed to ensure that the mechanisms that oversee the proper functioning of an application system work.

41) Define the parallel testing technique.
Parallel testing requires that the same input data be run through two versions of the same application. Parallel testing can be done with the entire application or with a segment of the application. The old version of the application system is kept running so that its operational status is maintained in the event that problems are encountered in the new application.
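A minimal parallel-testing sketch for question 41, using hypothetical old and new payroll calculations: the same input data is run through both versions, and any differences are reported so they can be reconciled.

    def old_payroll(hours, rate):
        """Current production version."""
        return hours * rate

    def new_payroll(hours, rate):
        """New version under test: overtime above 40 hours paid at 1.5x."""
        overtime = max(hours - 40, 0)
        return (hours - overtime) * rate + overtime * rate * 1.5

    # Run identical input data through both versions and compare the results.
    test_inputs = [(35, 20.0), (40, 20.0), (45, 20.0)]
    for hours, rate in test_inputs:
        old, new = old_payroll(hours, rate), new_payroll(hours, rate)
        status = "match" if old == new else f"difference: old={old} new={new}"
        print(f"hours={hours} rate={rate}: {status}")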

42) Testing Tools:
Acceptance Test Criteria: The development of system standards that must be achieved before the user will accept the system for production purposes.
Boundary Value Analysis: A method of dividing application systems into segments so that testing can occur within the boundaries of those segments.
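In its more common usage, boundary value analysis selects test data at, just below, and just above the edges of an input domain. A minimal sketch with a hypothetical age-validation rule:

    # Hypothetical input rule: valid ages are 18 through 65 inclusive.
    def is_valid_age(age):
        return 18 <= age <= 65

    # Test at and around each boundary of the input domain.
    boundary_cases = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}

    for age, expected in boundary_cases.items():
        assert is_valid_age(age) == expected, f"boundary case failed for age {age}"
    print("all boundary cases passed")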

Cause-Effect Graphing: The objective is to reduce the number of test conditions by eliminating the need for multiple tests that all produce the same effects.
Checklist: A series of probing questions designed for use in reviewing a predetermined area or function.
Code Comparison: Identifies differences between two versions of the same program.
Compiler-Based Analysis: Diagnostic routines are added to a compiler to identify program defects during compilation of the program.
Complexity-Based Metric Testing: Uses statistics and mathematics to develop highly predictive relationships that can be used to identify the complexity of computer programs and the completeness of testing in evaluating the complex logic.
Confirmation/Examination: Verifies the correctness of many aspects of the system by contacting third parties, such as users, or by examining a document to verify that it exists.
Control Flow Analysis: Requires the development of a graphic representation of a program to analyze the branch logic within the program and identify logic problems.
Correctness Proof: Requires a proof hypothesis to be defined and then used to evaluate the correctness of the system.
Coverage-Based Metric Testing: Uses mathematical relationships to show what percentage of the application system has been covered by the test process. The resulting metric should be usable for predicting the effectiveness of the test process.
Data Dictionary: Generates test data to verify data validation programs, based on the data contained in the dictionary.
Data Flow Analysis: A method of ensuring that the data used by the program has been properly defined and that the defined data is properly used.
Design-Based Functional Testing: Recognizes that functions within an application system are necessary to support the requirements. This process identifies those design-based functions for test purposes.
Design Reviews: Requires reviews at predetermined points throughout systems development in order to examine progress and ensure that the development process is followed.
Desk Checking: Reviews by the originator of the requirements, design, or program as a check on the work performed by that individual.
Disaster Test: The test group simulates a disaster or system failure to determine whether the system can be correctly recovered after the failure.
Error Guessing: Uses the experience or judgment of people to predetermine, through guessing, what the most probable errors will be, and then tests to ensure that the system can handle those conditions.
Exhaustive Testing: Performs sufficient testing to evaluate every possible path and condition in the application system.
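As an illustration of the code comparison tool above, a minimal sketch that uses Python's standard difflib module to list the differences between two hypothetical versions of the same program:

    import difflib

    old_version = ["def tax(amount):", "    return amount * 0.10", ""]
    new_version = ["def tax(amount):", "    return amount * 0.12", ""]

    # Produce a unified diff showing what changed between the two versions.
    for line in difflib.unified_diff(old_version, new_version,
                                     fromfile="tax_v1.py", tofile="tax_v2.py",
                                     lineterm=""):
        print(line)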

Flowchart: Graphically represents the system and/or program flow in order to evaluate the completeness of the requirements, design, or program specifications.
Inspections: Requires a step-by-step explanation of the product, with each step checked against a predetermined list of criteria.
Instrumentation: Measures the functioning of a system structure by using counters and other monitoring instruments.
Integrated Test Facility: Permits the integration of test data in a production environment to enable testing to run during production processing.
Mapping: Identifies which parts of a program are exercised during a test and at what frequency.
Modeling: Simulates the functioning of the environment or system structure in order to determine how efficiently the proposed system solution will function.
Parallel Operation: Verifies that the old and new versions of the application system produce equal or reconcilable results.
Parallel Simulation: Approximates the expected results of processing by simulating the process to determine whether test results are reasonable.
Peer Review: Provides an assessment by peers (coworkers) of the efficiency, adherence to standards, etc. of the product, which is designed to improve the quality of the product.
Risk Matrix: Produces a matrix showing the relationship between the system risk, the segment of the system where the risk occurs, and the presence or absence of controls to reduce that risk.
SCARF (System Control Audit Review File): Builds a history of potential problems in order to compare problems in a single unit over a period and/or compare like units.
Scoring: The process can be used to determine the degree of testing (for example, high-risk systems would be subject to more tests than low-risk systems) or to identify areas within the application system in order to determine the amount of testing needed.
Snapshot: A method of printing the status of computer memory at predetermined points during processing. Computer memory can be printed when specific instructions are executed.
System Logs: Uses information collected during the operation of a computer system for analysis purposes, to determine how well the system performed. The logs used are those produced by operating software such as database management systems, operating systems, and job accounting systems.
Test Data: System transactions that are created for the purpose of testing the application system.
Test Data Generator: Software systems that can be used to automatically generate test data for test purposes.
Tracing: A representation of the paths followed by computer programs as they process data.
Utility Programs: Analyze and print the results of a test through the use of a general-purpose program.
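A minimal test data generator sketch (the record layout and field ranges are hypothetical): it produces repeatable transactions that can be fed to a data validation program.

    import random

    def generate_test_transactions(count, seed=42):
        """Generate repeatable order-entry test transactions."""
        random.seed(seed)                      # fixed seed -> repeatable test data
        transactions = []
        for i in range(1, count + 1):
            transactions.append({
                "order_id": i,
                "quantity": random.randint(1, 100),
                "unit_price": round(random.uniform(0.5, 500.0), 2),
            })
        return transactions

    for txn in generate_test_transactions(3):
        print(txn)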

Volume Testing: Identifies system restrictions (e.g., internal table sizes) and then creates a large volume of transactions designed to exceed those limits.
Walkthroughs: A process that asks the programmer or analyst to explain the application system to a test team by executing the application system. The objective of the walkthrough is to provide a basis for questioning by the test team as a means of identifying defects.
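A minimal volume-testing sketch for the entry above, with a hypothetical in-memory table limit: the test deliberately pushes more transactions than the limit allows and confirms that the system enforces the restriction in a detectable way.

    TABLE_LIMIT = 1000          # hypothetical restriction: max rows in the order table

    class OrderTable:
        def __init__(self, limit=TABLE_LIMIT):
            self.limit = limit
            self.rows = []

        def insert(self, row):
            if len(self.rows) >= self.limit:
                raise OverflowError("order table full")
            self.rows.append(row)

    # Volume test: exceed the stated limit on purpose.
    table = OrderTable()
    try:
        for i in range(TABLE_LIMIT + 1):
            table.insert({"order_id": i})
        print("volume test failed: limit was never enforced")
    except OverflowError:
        print(f"volume test passed: limit enforced at {len(table.rows)} rows")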

43) What are the steps involved in selecting and using the test tools?
Four steps are involved in selecting and using the test tools:
Step 1: Matching the tool to its intended use.
Step 2: Selecting a tool appropriate to the life cycle phase in which it will be used.
Step 3: Matching the tool to the skill level of the tester.
Step 4: Selecting an affordable tool.

44) List out the various skills needed by the tester.
The individual performing the test must select a tool that conforms to his or her skill level. The skills are divided into user skill, programming skill, system skill, and technical skill.
User Skill: Includes general business skills specializing in the area being computerized, general management skills, and knowledge of identifying and dealing with user problems.
Programming Skill: Includes understanding of computer concepts, flowcharting, programming in the languages used, and debugging and documenting computer programs.
System Skill: Requires the ability to translate user requirements into computer system design specifications, flowcharting, problem analysis, design methodologies, computer operations, some general business skills, programming skills, error identification and analysis in automated applications, and project management.
Technical Skill: System programming, database administration, statistics, accounting, and operating software packages.

45) What are the objectives of appointing managers for testing tools?
Objective 1: Create a source of competency about how to use the tool.
Objective 2: Assign someone accountable to oversee tool usage. Without someone accountable for ensuring that tools are properly used, tools may fall into disuse.
Objective 3: Provide a training ground for future managers.

46) What are the steps involved in appointing managers for testing tools?
1) Tool manager selection 2) Assign the tool manager duties 3) Limiting the tool manager's tenure.

47) Explain the two types of computer testing.
There are two general categories of testing - pre-implementation and post-implementation testing.
Pre-implementation Testing: Activities that occur prior to placing the application system in an operational status. The objective of pre-implementation testing is to determine that the system functions as specified and that defects in the system are removed prior to placing the system into production.
Post-implementation Testing: Testing that occurs after the system goes into operation; it is normally considered part of systems maintenance.

48) Explain the cost of computer testing.
The cost of computer testing includes both pre-implementation and post-implementation cost. The cost of removing system defects prior to the system going into production includes:
1) Building the defect into the system 2) Identifying the existence of the defect 3) Correcting the defect 4) Testing to determine that the defect is removed.
Defects uncovered after the system goes into operation generate the following costs:
1) Specifying and coding the defect into the system 2) Detecting the problem within the application system 3) Reporting the problem to information services and/or the user 4) Correcting the problems caused by the defect 5) Operating the system until the defect is corrected

6) Correcting the defect 7) Testing to determine that the defect no longer exists 8) Integrating the corrected programs into production.

49) Who will organize the software development test team?
A member of the project team with testing experience organizes the test team. In many organizations, the testing will be performed by an independent test organization. The test team is separate from the project team, but may report to, and obtain its resources from, the project team.

50) How will you select the test team members?
System designer or manager of your test team, user personnel, computer operations staff, data administrators, internal auditors, quality assurance staff, information services management, security administrators, and professional testers.

51) List out the eleven steps involved in the software testing process.
Step 1: Assess development plan and status
Step 2: Develop the test plan
Step 3: Test software requirements
Step 4: Test software design
Step 5: Test software construction
Step 6: Execute tests
Step 7: Acceptance test
Step 8: Report test results
Step 9: Test software installation
Step 10: Test software changes
Step 11: Evaluate test effectiveness

52) Explain the tester's workbench used with the eleven-step process.
The workbench includes:
Overview: A brief description of the step.
Objective: A detailed description of the purpose of the step.
Concerns: Specific challenges that testers will have to overcome to complete the step effectively.
Workbench: A description of the process that the testers should follow to complete the step.
Input: The documents, information, and skills needed to complete the step.
Do Procedure: Detailed, task-by-task procedures that testers must follow to complete the step.

Check Procedure: A checklist that testers use to verify that they have performed a step correctly.
Output: The deliverables that the testers must produce at the conclusion of each step.
Guidelines: Suggestions for performing each step more effectively and for avoiding problems.

53) What are the tasks to be performed in Step 1?
Task 1: Test the project estimate.
Task 2: Test the project status.

54) Explain the role of the moderator involved in Step 2 of the eleven-step process.
Moderator Role: The moderator is trained to coordinate, lead, and control the inspection process, and to oversee any necessary follow-up. Specifically, the moderator:
1) Organizes the inspection by selecting the participants; verifying the distribution of the inspection materials; and scheduling the overview, inspection, and required follow-up sessions.
2) Leads the inspection process; ensures that all participants are prepared; encourages participation; maintains focus on finding defects; controls flow and direction; and maintains objectivity.
3) Controls the inspection by enforcing adherence to the entry and exit criteria; seeks consensus on defects; makes the final decision on disagreements; directs the recording and categorizing of defects; summarizes inspection results; and limits inspections to one to two hours.
4) Ensures the author completes the follow-up tasks.
5) Completes the activities listed in the moderator checklist.

55) Explain the role of the reader involved in Step 2 of the testing process.
The reader is responsible for setting the pace of the inspection. He or she does this by paraphrasing or reading the product being inspected. Specifically, the reader:
Is not also the moderator or the author
Has a thorough familiarity with the material to be inspected
Objectively presents the product
Paraphrases or reads the product material line by line, pacing for clarity and comprehension.

56) Explain the role of the recorder involved in Step 2 of the testing process.
The recorder is responsible for listing defects and summarizing the inspection results. Specifically, the recorder:
May also be the moderator, but cannot be the reader or the author
Records every defect found
Presents the defect list for consensus by all participants in the inspection

Classifies the defects, as directed by the inspectors, by type, class, and severity based on predetermined criteria.

57) Explain how you will form the risk team in Step 3.
The team should be comprised of three to six members and should, at a minimum, possess the following skills: knowledge of the user application; understanding of risk concepts; ability to identify controls; familiarity with both application and information services risks; understanding of information services concepts and system design; and understanding of computer operations procedures. The candidates included on the risk team should at a minimum include someone from the user area and any of the following: internal auditor, risk consultant, data processor, security officer, computer operations manager.

58) Explain the tasks involved in the design phase testing step.
Task 1: Score success factors
Task 2: Analyze design factors
Task 3: Conduct design review
Task 4: Inspect design deliverables

59) Explain the scoring of success factors.
Scoring is a predictive tool that utilizes previous systems experience. Existing systems are analyzed to determine their attributes and the correlation of those attributes to the success or failure of that particular application. Once the attributes correlating to success or failure have been identified, they can be used to predict the behavior of systems under development.

BIG QUESTIONS: KEY POINTS

UNIT I:
1) ASSESSING THE QUALITY OF THE EXISTING TEST PROCESS
Introduction
Input products
Implementation procedure
Explain - four steps
Diagram - software testing assessment workbench
Kiviat chart
Work paper
Check procedures

Deliverables

2) ASSESSING THE QUALITY OF THE TESTER
Introduction
Input products
Implementation procedure
Explain - four steps
Diagram - software testing assessment workbench
Kiviat chart
Work paper
Check procedures
Deliverables

3) STRUCTURED APPROACH TO TESTING
Introduction
Diagram - traditional software development life cycle
Diagram - life cycle verification activities
Requirements, design, program, test, installation, and maintenance phases of the process

4) TEST STRATEGY
Introduction
Test factors - definition
Explain - various available test factors
Test phase - definition

5) DEVELOPING A TEST STRATEGY
Introduction
Explain - four steps
Step 1: Select and rank test factors
Step 2: Identify the system development phases
Step 3: Identify the business risks associated with the system under development
Step 4: Place risks in the matrix
Diagram - test factor/test phase matrix

UNIT II:

1) EIGHT CONSIDERATIONS IN DEVELOPING TESTING METHODOLOGIES
Introduction
Explain:
Acquire and study the test strategy
Determine the type of development project
Determine the type of software system
Determine the project scope
Identify the tactical risks
Determine when testing should occur
Build the system test plan
Build the unit test plan
Testing tactics checklist

2) STRUCTURAL SYSTEM TESTING TECHNIQUES
Introduction
Explain, for each technique: objectives, how to use, when to use, examples
Stress testing
Execution testing
Recovery testing
Operations testing
Compliance testing
Security testing
Figure - structural testing techniques

UNIT III:
1) FUNCTIONAL SYSTEM TESTING TECHNIQUES
Introduction
Explain, for each technique: objectives, how to use, when to use, examples
Requirements testing
Regression testing
Error-handling testing
Manual-support testing
Intersystem testing
Control testing
Parallel testing
Figure - functional testing techniques

2) Selecting and Using the Test Tools
Introduction
Explain:
Step 1: Matching the tool to its use
Step 2: Selecting a tool appropriate to the life cycle phase
Step 3: Matching the tool to the skill level of the tester
Step 4: Selecting an affordable tool
Figure - SDLC phase/test tool matrix

3) Appointing Managers for Testing Tools
Introduction
Objectives
Explain:
Step 1: Tool manager selection
Step 2: Assign the tool manager duties
Step 3: Limiting the tool manager's tenure
Figure - tool manager's workbench for managing testing tools

UNIT IV:
1) BRIEF INTRODUCTION TO THE 11-STEP SOFTWARE TESTING PROCESS
Introduction
Explain:
Assess development plan and status
Develop the test plan
Test software requirements
Test software design
Test software construction
Execute tests

Acceptance test
Report test results
Test software installation
Test software changes
Evaluate test effectiveness

2) ASSESS PROJECT MANAGEMENT DEVELOPMENT ESTIMATE AND STATUS
Overview
Objective
Concerns
Workbench
Input
Input 1: Project plan
Input 2: Project estimate
Input 3: Development process
Do Procedure
Task 1: Test project estimate
Task 2: Test project status
Check Procedures
Work paper - questions
Output
Test report - adequacy of the test estimate and the reasonableness of the project status
Guidelines

3) DEVELOP TEST PLAN
Overview
Objective
Concerns
Workbench
Input
Input 1: Project plan
Input 2: Project plan assessment and status
Do Procedure
Task 1: Form test team
Task 2: Understand the project risks/concerns
Task 3: Inspect test plan
Check Procedures
Work paper - questions
Output
Test plan

Guidelines
Start early
Keep the test plan flexible
Frequently review the test plan
Keep the test plan concise and readable
Calculate the planning effort
Spend the time to do a complete test plan

4) REQUIREMENTS PHASE TESTING
Overview
Objective
Concerns
Workbench
Input
Input 1: Project deliverables defining requirements
Input 2: Requirements-gathering process
Do Procedure
Task 1: Prepare risk matrix
Task 2: Perform test factor analysis
Task 3: Conduct a requirements walkthrough
Check Procedures
Work paper - questions
Output
Test report - indicating requirements deficiencies
Guidelines

5) DESIGN PHASE TESTING
Overview
Objective
Concerns
Workbench
Input
Input 1: Design process
Input 2: Design phase deliverables
Do Procedure
Task 1: Score success factors
Task 2: Analyze design factors
Task 3: Conduct design review
Task 4: Inspect design deliverables

Check Procedures
Work paper - questions
Output
Two categories of output are produced:
1) Deficiencies uncovered in the design review
2) The defects and the assessment produced by the inspection process.
One of three assessments of the product being inspected:
1) No defect found
2) Minor rework required
3) Major rework required
Guidelines

6) PROGRAM PHASE TESTING
Overview
Objective
Concerns
Workbench
Input
Input 1: Program phase process
Input 2: Program phase deliverables
Do Procedure
Task 1: Desk debug the program
Task 2: Program phase test factor analysis
Task 3: Peer review
Check Procedures
Work paper - questions
Output
Two outputs are produced:
1) A fully debugged program, using static testing to uncover and remove defects
2) A list of the defects uncovered during testing
Guidelines

7) EXECUTE TESTS
Overview
Objective
Concerns
Workbench
Input
Input 1: Operational test environment
Input 2: Test plan
Input 3: Test program library
Input 4: System/program documentation
Do Procedure
Task 1: Build test data/scripts

Task 2: Execute tests
Task 3: Record test results
Check Procedures
Work paper - questions
Output
Three outputs from this step:
a) The test transactions needed to validate the software system
b) The results from executing those transactions
c) Variances from expected results
Guidelines

UNIT V:
1) TESTING CLIENT/SERVER SYSTEMS
Overview
Explain - client/server system
Diagram - client/server architecture
Objective
Concerns
Workbench
Input
Input 1: Client/server system
Do Procedure
Task 1: Assess readiness
Task 2: Assess key components
Task 3: Test system
Check Procedures
Work paper - questions
Output
Test report - indicating what works and what does not work; it also contains recommendations by the test team for improvements where appropriate
Guidelines

2) TESTING RAPID APPLICATION DEVELOPMENT
Overview
Objective
Concerns
Workbench
Input
Input 1: RAD system requirements

Do Procedure
Task 1: Test planning iterations
Task 2: Test subsequent planning iterations
Task 3: Test final planning iterations
Check Procedures
Work paper - questions
Output
Test report - findings at the end of the testing of each iteration of the RAD development. Those reports indicate what works and what does not work; they also contain the testers' recommendations for improvement.
Guidelines

3) TESTING THE ADEQUACY OF SYSTEM DOCUMENTATION
Overview
Objective
Concerns
Workbench
Input
Input 1: Documentation standards
Input 2: System documentation
Do Procedure
Task 1: Measure project documentation needs
Task 2: Determine importance of documents
Task 3: Determine completeness of documents
Task 4: Determine currentness of documents
Check Procedures
Work paper - questions
Output
Test report - outlining deficiencies within the system documentation. The deficiencies should be based first on variance from standards, and second on failure to meet the intent of the standards.
Guidelines

4) TESTING WEB-BASED APPLICATIONS
Overview
Explain - web-based architecture
Objective
Concerns
Workbench
Input
Input 1: Web software/hardware

Do Procedure
Task 1: Select web-based risks to include in the test plan
Task 2: Select web-based tests
Task 3: Select web-based test tools
Task 4: Test web-based systems
Check Procedures
Work paper - questions
Output: Web test report - this report should contain:
A brief description of the web-based system
Risks addressed and not addressed by the web-based test team
Types of testing performed, and types of testing not performed
Test tools used
Web-based functionality and structure tested that performed correctly
Web-based structure and functionality tested that did not perform correctly
The web-based test team's opinion regarding the adequacy of the web-based system to be placed into a production status
Guidelines
Get senior management support for buying and integrating test tools
Know your requirements
Be reasonable in your expectations - start small and grow
Have a strong testing process that includes tools
Don't cut the training corner

5) TESTING OFF-THE-SHELF SOFTWARE
Overview
Define off-the-shelf software
Objective
Concerns
Workbench
Input
Input 1: User manuals
Input 2: Software
Do Procedure
Task 1: Test business fit
Task 2: Test operational fit
Task 3: Test people fit
Task 4: Acceptance-test software processing
Check Procedures
Work paper - questions

Output
OTSS assessment - there are three potential outputs:
Fully acceptable
Unacceptable
Acceptable with conditions
Guidelines
Spend time learning and evaluating software, and you will gain problem-free use of that software
Only acquire computer software after you have established the need for that software and can demonstrate how it will be used in day-to-day work
Instinct regarding goodness and badness should be used to help you select software

6) TESTING IN A MULTIPLATFORM ENVIRONMENT
Overview
Objective
Concerns
Workbench
Input
Input 1: Platforms included
Input 2: Software
Do Procedure
Task 1: Define platform configuration concerns
Task 2: List needed platform configurations
Task 3: Assess test rooms
Task 4: List software structure platform effects
Task 5: List interface platform effects
Task 6: Execute tests
Check Procedures
Work paper - questions
Output
Test report indicating:
Structural components that work or don't work by platform
Interfaces that work or don't work by platform
Multiplatform operational concerns that have been eliminated or substantiated
Platforms on which the software should operate, but that have not been tested
Guidelines

7) TESTING SECURITY
Overview

Objective
Concerns
Workbench
Input
Input 1: Team knowledge about the secured locations / information systems
Do Procedure
Task 1: Identify potential perpetrators
Task 2: Identify potential points of penetration
Task 3: Create a penetration point matrix
Task 4: Identify high-risk points of penetration
Task 5: Execute security tests
Check Procedures
Work paper - questions
Output
Penetration point matrix - identifying the high-risk points of penetration
Guidelines

8) TESTING A DATA WAREHOUSE
Overview
Objective
Concerns
Workbench
Input
Input 1: Data warehouse activity processes
Do Procedure
Task 1: Measure the magnitude of data warehouse concerns
Task 2: Identify data warehouse activity processes to test
Task 3: Test the adequacy of data warehouse activity processes
Check Procedures
Work paper - questions
Output
Assessment results - to assure that the data warehouse activity is effectively operated
Guidelines
