Software Testing: Black box testing

SE 110 – Spring 2013

Black-box testing
• Black-box testing is done without knowledge of the internals of the system under test.
• It is done from the customer’s viewpoint and involves looking at the specifications only.
  – It requires a functional knowledge of the product to be tested.
• Black-box tests are convenient to administer as they use the complete finished product and do not require any knowledge of its construction.

Characteristics of black-box testing
• Done based on requirements.
• Addresses (should address) stated as well as implied requirements.
• Encompasses the end-user perspective.
• Checks for valid and invalid conditions / inputs.
• May or may not know the technology aspects of the product.

Typical errors found in black-box testing
• Incorrect or missing functionality
• Interface errors
• Errors in data structures / database access
• Behavior errors
• Performance errors
• Initialization and termination errors


Black-box vs. white-box testing

Black-box testing:
• Has no access to program code
• Requires an external perspective
• A set of techniques applicable to all phases of testing

White-box testing:
• Has access to program code
• Requires knowledge of the program code
• Typically applies only to unit testing, where code is involved

Black-box testing techniques
• Requirements-based testing
• Positive and negative testing
• Boundary value analysis
• Decision tables
• Equivalence partitioning
• State-based testing
• Compatibility testing
• User documentation testing
• Domain testing (leads to ad hoc testing)

General format for discussion of techniques
• Present some reasoning where applicable.
• List out one or two examples.
• Walk through the examples.
• Summarize the process for using the technique.


Requirements-based testing
• Done to ensure that all requirements in the SRS are tested.
• Review requirements first to ensure they are consistent, correct, complete, and testable.
  – Review enables translation of (some of) the implied requirements to stated requirements.
  – Differentiates between implicit and explicit requirements.
• A reviewed SRS tabulates requirements, along with a requirement ID and a priority.
• This is the genesis of a Requirements Traceability Matrix (RTM).

RTM: Example

Req. ID  Description  Priority  Test conditions  Test case IDs                 Phase of testing
BR-01    —            High      —                Test_001                      Unit
BR-02    —            High      —                Test_002                      Component
BR-03    —            Medium    —                Test_003, Test_004, Test_005  Unit, Component, Integration

• Test condition: different ways of testing a requirement (different types of mapping)
• Test case: different conditions / scenarios for a given requirement
• Phase of testing: helps in scheduling
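Since an RTM is essentially tabular data, it can also be kept in machine-readable form and checked automatically. The sketch below is illustrative, not from the slides: the dictionary mirrors the example rows, and the hypothetical untraced() helper flags requirements with no mapped test case.

```python
# Illustrative sketch: the RTM above as plain data, plus a traceability
# check. The untraced() helper is hypothetical, not part of the slides.
rtm = {
    "BR-01": {"priority": "High",   "tests": ["Test_001"],
              "phases": ["Unit"]},
    "BR-02": {"priority": "High",   "tests": ["Test_002"],
              "phases": ["Component"]},
    "BR-03": {"priority": "Medium", "tests": ["Test_003", "Test_004", "Test_005"],
              "phases": ["Unit", "Component", "Integration"]},
}

def untraced(rtm):
    """Return requirement IDs that have no test case mapped to them."""
    return [req_id for req_id, row in rtm.items() if not row["tests"]]

assert untraced(rtm) == []  # every requirement has at least one test case
```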


Positive and negative testing
• Positive testing is done to check that the product does what it is supposed to do.
  – Behaves correctly when given the right inputs.
  – Maps to a specific requirement.
  – “Coverage” is defined better.
• Negative testing is done to show that the product does not fail when given unexpected inputs.
  – Tries to break the system.
  – No direct mapping to a specific requirement.
  – “Coverage” is more challenging.
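As a minimal sketch of the distinction, assume a hypothetical validator parse_age() whose requirement is to accept integer ages from 0 to 130; the function, its name, and the range are all illustrative assumptions, not from the slides.

```python
def parse_age(text: str) -> int:
    value = int(text)  # raises ValueError on non-numeric input
    if not 0 <= value <= 130:
        raise ValueError(f"age out of range: {value}")
    return value

# Positive tests: right inputs, each traceable to the stated requirement.
assert parse_age("0") == 0
assert parse_age("42") == 42

# Negative tests: unexpected inputs should be rejected cleanly, not crash.
for bad in ["-1", "131", "forty-two", ""]:
    try:
        parse_age(bad)
    except ValueError:
        pass  # rejected as expected
    else:
        raise AssertionError(f"negative test failed for {bad!r}")
```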


Boundary Value Analysis (BVA)
• Most defects come up near “boundaries”.
• Reasons from a white-box perspective:
  – Programmers’ tentativeness in using the right relational operator (< or <=?)
  – Multiple ways of implementing loops
  – Confusing array subscripts
• Reasons from a black-box perspective:
  – Unclear requirements
  – Ambiguous or “it depends” mindset!

BVA: Example
• A database starts with a pre-allocated number of buffers for caching.
• Buffers are filled up as needed.
• If full, buffers are freed on a FIFO basis.

BVA: Examples
• Look for any kind of gradation or discontinuity in data values that affects computation – the discontinuities are the boundary values, requiring thorough testing.
• Look for any internal limits, like limits on resources (like the example of buffers given above). The behavior of the product at these limits should also be the subject of boundary value testing.
• Also include in the list of boundary values documented limits on hardware resources. For example, if it is documented that a product will run with a minimum of 4MB of RAM, make sure you include test cases for the minimum RAM (i.e., 4MB in this case).
• The examples given above discuss boundary conditions for input data – the same analysis needs to be done for output variables also.
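As a sketch of how BVA turns a documented limit into concrete test inputs, the helper below picks the values just below, at, and just above each end of a closed range. Both the helper and the 1–64 buffer capacity are illustrative assumptions, not from the slides.

```python
# Illustrative BVA helper: for a documented closed range [lo, hi],
# test just below, at, and just above each boundary.
def boundary_values(lo: int, hi: int) -> list[int]:
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# E.g., suppose the buffer pool is documented to hold 1 to 64 buffers
# (an assumed limit, for illustration only).
print(boundary_values(1, 64))  # [0, 1, 2, 63, 64, 65]
```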


Decision tables
• A program’s behavior is characterized by several decision variables.
• Each decision variable specifies a Boolean condition.
  – Input and output data can be expressed as Boolean conditions (TRUE, FALSE, DON’T CARE).
• The distinct combinations of these decision variables lead to different scenarios.
• Each scenario occupies a row in the decision table, and the row also has expected results.
• One representative data point from each scenario needs to be tested.

Decision tables: Example
• Taxpayers have a choice of either taking a standard deduction (SD) or itemizing their deductions.
• Various factors determine the SD:
  – Single: $4750
  – Married and filing a joint return: $9500
  – Married and filing a separate return: $7000
  – If the filer or spouse is 65 years or older, an additional SD of $1000 is allowed
  – If the filer or spouse is blind, an additional SD of $1000 is allowed

Decision tables: Example

Status                           Status of spouse  Age (<65 or >65)  Age of spouse  Blind (yes or no)  Spouse blind  SD amount
Single                           ---               <65               ---            No                 ---           $4750
Married, filing separate return  ---               <65               ---            No                 No            $7000
Married, filing joint return     ---               >65               ---            No                 No            $10,500

(--- denotes the DON’T CARE symbol)

Decision tables: Process
• Identify the decision variables.
• Identify the possible values of each of the decision variables.
• Enumerate the combinations of the allowed values of each of the variables.
• Identify the cases in which values assumed by a variable (or by sets of variables) are immaterial for a given combination of other input variables.
  – Represent such variables by the Don’t Care symbol.
• Form a table, listing on each but the last column one decision variable. On the last column, list out the action or expected result.
• For each combination of values of decision variables (appropriately minimized with the Don’t Care scenarios), list the action item for the combination of variables in that row (including Don’t Cares, as appropriate).
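To make the process concrete, here is a sketch of the standard-deduction example as executable logic plus one representative test per decision-table row. The dollar amounts come from the slides; the encoding, function name, and default arguments are illustrative.

```python
# Base SD by filing status (values from the example above).
BASE_SD = {"single": 4750, "married_joint": 9500, "married_separate": 7000}

def standard_deduction(status, filer_65=False, spouse_65=False,
                       filer_blind=False, spouse_blind=False):
    """Base SD for the filing status, plus $1000 per 65+/blind condition."""
    extra = 1000 * sum([filer_65, spouse_65, filer_blind, spouse_blind])
    return BASE_SD[status] + extra

# One representative data point per decision-table row; arguments left
# at their defaults play the role of the DON'T CARE entries.
assert standard_deduction("single") == 4750
assert standard_deduction("married_separate") == 7000
assert standard_deduction("married_joint", filer_65=True) == 10500
```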


Equivalence partitioning
• A generalization of BVA / decision tables.
• Divide the (potentially infinite) set of input values into a set of equivalence classes or partitions.
• One element of the class can act as a representative for the entire class.
• The result of the test for that one element is extrapolated to all the other elements.

Equivalence partitioning: Basic hypothesis
• Suppose an input domain is divided into equivalence class partitions P1, P2, …, PN.
• The behavior of the system is IDENTICAL for all inputs belonging to the same partition Pi.
• From a testing perspective, ONE input from each partition is sufficient for the purpose of testing.

Equivalence Classes (EqC)
• Single-domain equivalence classes
  – Equivalence classes are formed on the basis of analysis of only one input domain (e.g., age).
• Multi-domain equivalence classes
  – Multiple domains (e.g., age and gender) can be used together.
  – The total number of equivalence classes is the Cartesian product of all the individual equivalence classes.

Single-Domain EqC: Example
• Credit hour requirements for students are given as follows:
  – 60 credit hours for M.Tech.
  – 60 credit hours for MS by Research
  – 72 credit hours for Ph.D.
• What are the equivalence classes here?
  – 0 to 60 credit hours
  – 60 to 72 credit hours (covers the M.Tech. and MS by Research requirements)
  – More than 72 credit hours (covers the Ph.D. requirement)
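A minimal sketch of how these single-domain classes could drive test selection, with one representative value per partition; the representative values and the qualifies_for_phd() helper are illustrative picks, not from the slides.

```python
# One arbitrary representative per equivalence class of credit hours.
partitions = {
    "0 to 60 credit hours":      30,
    "60 to 72 credit hours":     66,
    "more than 72 credit hours": 90,
}

def qualifies_for_phd(credit_hours: int) -> bool:
    """72 credit hours are required for a Ph.D. (per the example)."""
    return credit_hours >= 72

# Test one element per class; the result is extrapolated to the class.
for name, representative in partitions.items():
    print(f"{name}: qualifies_for_phd -> {qualifies_for_phd(representative)}")
```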

Multi-Domain EqC: Example — processMessage(messageBuffer)
• processMessage is the central component of a message-processing software.
• The processor is responsible for receiving a message buffer and processing the message that is present inside the message buffer.
• The first byte of the message buffer comprises the header. The header is interpreted as an unsigned integer and indicates the message type. Three message types are supported (0–2).
• The maximum size of the message body is 1023 bytes.
• The actual processing logic of the message is implemented using third-party libraries that are available.

Multi-Domain EqC: Example

[Partition diagram: header value (0–255) on one axis, number of bytes in the body on the other]
• Header equivalence classes: H1 = header value 0 to 2 (valid message types); H2 = header value 3 to 255 (invalid).
• Body equivalence classes: B1 = empty body (invalid); B2 = 1 to 1023 bytes (valid); B3 = more than 1023 bytes (invalid).
• Only the combination of H1 with B2 is VALID; every other combination is INVALID.

Multi-Domain EqC: Example
• The cross product of the partitions gives the exhaustive list of combinations of inputs to test.
• Body (B1, B2, B3) and Header (H1, H2) give the combinations <B1,H1>, <B1,H2>, <B2,H1>, <B2,H2>, <B3,H1>, <B3,H2>.
• Pick ONE test case for each combination.

Equivalence class test cases

[Same partition diagram, with one test case marked inside each of the six <Body, Header> regions]
• A total of 6 test cases are sufficient!
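A sketch of how the cross product could be enumerated in code, with one arbitrary representative input per class; the concrete byte values are illustrative picks, not from the slides.

```python
import itertools

# One arbitrary representative per equivalence class.
header_classes = {"H1": 1,     # valid message type (0-2)
                  "H2": 200}   # invalid message type (3-255)
body_classes = {"B1": b"",           # invalid: empty body
                "B2": b"x" * 512,    # valid: 1-1023 bytes
                "B3": b"x" * 2048}   # invalid: more than 1023 bytes

# Cross product of the partitions: 3 x 2 = 6 test cases in total.
for (b_name, body), (h_name, header) in itertools.product(
        body_classes.items(), header_classes.items()):
    buffer = bytes([header]) + body
    expect = "VALID" if (h_name == "H1" and b_name == "B2") else "INVALID"
    print(f"<{b_name},{h_name}>: {len(buffer)}-byte buffer, expect {expect}")
```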

Equivalence partitioning: Process
• Choose criteria for doing the equivalence partitioning (range, list of values, etc.).
• Identify the valid equivalence classes based on the above criteria (number of ranges, allowed values, etc.).
• Select a sample data point from each partition.
• Write the expected result based on the requirements given.
• Identify special values, if any, and include them in the table.
• Check to have expected results prepared for all the cases.
• If the expected result is not clear for any particular test case, mark it appropriately and escalate for corrective actions. If you cannot answer a question, or find an inappropriate answer, consider whether you want to record this issue in your log.


State/graph-based testing
• Useful for:
  – Language processors or compilers, wherein the syntax of the language automatically lends itself to a state machine.
  – Workflow modeling, where, depending on the current state and the appropriate combination of inputs, specific workflows are carried out, resulting in new output and a new state.
  – Dataflow modeling, where the system is modeled as a state machine.

which. SE 110 – Spring 2013 . – Optional sign can be followed by any number of digits. in turn. – If there is a decimal point. can be followed by a decimal point.State-based testing: Example #1 sign 1 digit digit decimal point 2 3 digit 4 digit 5 blank 6 blank • Validation of a number: – A number can start with an optional sign. then there should be two digits after the decimal. – Any number. should be terminated by a blank. whether or not it has a decimal point.

State-based testing: Example #2

[Workflow: Employee desires leave → leave application form → verify eligibility → (eligible) manager ensures feasibility → (feasible) approve / (not feasible) reject; (ineligible) reject]
• An employee fills up a leave application.
• Information goes to an automated system, which validates whether the employee is eligible for leave.
  – If not, the application is rejected.
  – If yes, the information goes to the manager, who validates the leave and gives the final approval.

State-based testing: Process
• Identify the grammar for the scenario. In the above examples, we have represented the diagram as a state machine. In some cases, the scenario can be a context-free grammar, which may require a more sophisticated representation of a “state diagram”, with memory etc.
• Design test cases corresponding to each valid state-input combination.
• Design test cases corresponding to the most common invalid combinations of state-input.


Compatibility testing
• To ensure that the product works consistently with infrastructure components.
• Could be parameters of hardware, OS, network…
  – Compatibility matrix (example in the next slide)
• Backward compatibility
  – Testing to ensure that product parameters that were created with an older version of the product continue to work with the current version of the same product.
• Forward compatibility
  – Testing to ensure that the risk involved in the product for future requirements is minimized.
  – Examples: testing a product with a beta version of the operating system, an early-access version of the developer’s kit, etc.

Compatibility matrix: Example

Server (both rows): Windows 2000 Advanced Server with SP4; Microsoft SQL Server 2000 with SP3a

Row 1:
• Application / web server: Windows 2000 Advanced Server with SP4 and .Net framework 1.1; IIS 5.0
• Client: Win2K Professional and Win2K Terminal Server
• Browser: IE 6.0 and IE 5.5 SP2
• MS Office: Office 2K and Office XP
• Mail server: Exchange 5.5 and 2K

Row 2:
• Application / web server: Windows 2000 Advanced Server with SP4 and .Net framework 1.1; IIS 5.0
• Client: Win2K Professional and Win2K Terminal Server
• Browser: Netscape 7.0, 7.1, Safari and Mozilla
• MS Office: Office 2K and Office XP
• Mail server: Exchange 5.5 and 2K
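A sketch of how such a matrix could drive parameterized tests, assuming pytest; the environment tuples are abbreviated from the matrix above, and the test body is a placeholder for provisioning the environment and running the product's smoke tests.

```python
import pytest

# Abbreviated rows of a compatibility matrix; each tuple is one
# supported environment combination.
ENVIRONMENTS = [
    ("Win2K Advanced Server SP4", "IIS 5.0", "IE 6.0"),
    ("Win2K Advanced Server SP4", "IIS 5.0", "Netscape 7.1"),
]

@pytest.mark.parametrize("server,web_server,browser", ENVIRONMENTS)
def test_product_in_environment(server, web_server, browser):
    # A real suite would provision this environment and run smoke tests;
    # here we only assert that the matrix row is well-formed.
    assert server and web_server and browser
```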


User documentation testing
• To check if what is stated in the document is available in the product.
• To check if what is there in the product is available in the document.
• Documents include user guides, installation guides, set-up guides, read-me files, software release notes, and on-line help.

Documentation testing: Benefits
• User documentation testing aids in highlighting problems that have been overlooked during reviews.
• High-quality user documentation ensures the consistency of documentation and product, thus minimizing possible defects reported by customers.
• Results in less difficult support calls.
  – Customers need less training and can proceed more quickly to advanced training and product usage.
  – Reduced training costs for customers.
• New programmers and testers who join a project group can use the documentation to learn the external functionality of the product.
• Contributes to better customer satisfaction and better morale of support staff.



Domain testing
• Have “domain people” perform tests by using the software.
• Tests what the users do on a typical business day.
• Business flow determines the test, not “logic” or “steps”.
• Captures the typical problems faced by users (not necessarily captured in the SRS).

When to use what…

When you want to test scenarios that have…  →  The most effective black-box testing technique is likely to be…
• Output values dictated by certain conditions depending upon values of input variables  →  Decision tables
• Input values in ranges, with each range exhibiting a particular functionality  →  Boundary value analysis
• Input values divided into classes (like ranges, lists of values, etc.), with each class exhibiting a particular functionality  →  Equivalence partitioning
• Checking for expected and unexpected input values  →  Positive and negative testing
• Workflows, process flows, or language processors  →  Graph-based testing
• To ensure that requirements are tested and met properly  →  Requirements-based testing
• To test using the domain expertise rather than the product specification  →  Domain testing
• To ensure that the documentation is consistent with the product  →  Documentation testing
