Hardware development engineers produce hardware, and software development engineers produce software; in the same way, software test engineers produce testware. Testware is produced by both verification and validation testing methods, and it includes test cases, test plans, test reports, and so on. Like software, testware should be placed under the control of a configuration management system, saved, and faithfully maintained. Like software, testware has significant value because it can be reused. The tester's job is to create testware that has a specified lifetime and is a valuable asset to the company.
The Test Plan describes "what has to be tested," while Test Cases describe "how to test it," which is why both documents are of equal importance in testing.
requirements. It is important to note that the QC people should not be completely isolated from the development team. The QC engineer can benefit a great deal from discussing the structure of the code and the design details with the developers; such a discussion might, for example, give her ideas for more test cases. The difference between discussing the structure of the code and discussing the interpretation of the requirements is that verifying the latter is, in fact, the purpose of the QC process.
Testing standards:
General testing standards for testing any type of application, based on various factors, are listed below:
1. Look & Feel
-> Uniformity of message boxes in terms of content, title, and position (they should be displayed at the center of the user interface (UI)).
-> Enabling and disabling of menu items/icons according to the user's security settings.
-> Ensure menu items are invoked as set in user security. If a service is not available for testing, the corresponding menu item should be disabled.
-> Tab-key navigation should be proper across fields, moving from left to right and then top to bottom.
-> Scrolling with the vertical and horizontal scrollbars should work properly.
-> Alignment of controls should be proper.
-> Spacing between controls should be proper.
-> Ensure uniformity of font (type, size, color).
-> In a multi-line text box, pressing the Enter key should move to the next line.
-> Ensure 'Backspace' and 'Space bar' work properly wherever applicable.
-> In list/combo boxes, press ALT+DOWN to display the list values and use the Down Arrow key for selection.
-> The Esc key should activate the Cancel button, and the Enter key should activate the OK button.
-> Check the spelling in message boxes, titles, help files, and tooltips.
-> For reports, check that column headers display properly at different zoom levels.
-> Check that tooltip text is provided for all icons in the UI.
2. Functionality Testing
-> Check the functionality of the application; the entire flow of the application has to be checked.
-> Functionality testing includes both positive and negative testing.
2.1. Positive Testing
-> Check the positive functionality of the application.
-> Check field validation using positive values within the permissible limits.
2.2. Negative Testing
-> Enter numbers in character fields and vice versa.
-> Enter only numbers in alphanumeric fields.
-> Know the permissible range for each field and check it using values that exceed the limits. Use Equivalence Partitioning and Boundary Value Analysis techniques to decide on the test data.
-> Save data without giving values for the mandatory fields.
-> Click randomly and tab out continuously (especially in grids) to check for application errors.
-> Check for spaces and updating with blank fields wherever applicable.
-> Maximize and minimize the screens and check the toolbar display.
3. Menu Organization
-> Simultaneous opening of screens.
4. Help Files
-> F1 should invoke context-sensitive help files.
5. User Interface Traversal
-> Every mouse-driven operation should have a keyboard equivalent; provide shortcut keys and alternate keys in menus wherever applicable.
6. Date Format
-> The application should support the various date formats in the regional settings.
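As a minimal sketch of the Equivalence Partitioning and Boundary Value Analysis techniques mentioned above, the following Python snippet derives test data for a numeric field. The field range 1..100 is an invented example for illustration, not taken from any particular application:

```python
def boundary_values(lo, hi):
    """Boundary Value Analysis: values at, just inside, and just
    outside each boundary of the valid range [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def equivalence_classes(lo, hi):
    """Equivalence Partitioning: one representative value per class --
    below the range (invalid), inside it (valid), above it (invalid)."""
    return {
        "invalid_low": lo - 10,
        "valid": (lo + hi) // 2,
        "invalid_high": hi + 10,
    }

if __name__ == "__main__":
    # For a field accepting 1..100:
    print(boundary_values(1, 100))      # [0, 1, 2, 99, 100, 101]
    print(equivalence_classes(1, 100))  # one representative per partition
```

The point of both techniques is that, instead of testing every possible value, a small, systematically chosen set of values gives comparable defect-finding power.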
3. How Does Traceability Ensure the Life Cycle is Followed?
   1. It demonstrates the relationship between design inputs and design outputs.
   2. It ensures that the design is based on predecessor, established requirements.
   3. It helps ensure that design specifications are appropriately verified and that functional requirements are appropriately validated.
   4. Important: Traceability is a two-way street. Maintain it both "backwards" and "forwards" - tunnel vision is not acceptable in the software life cycle!
4. Traceability Across the Life Cycle
   1. Risk Analysis (initial and ongoing activities)
      1. Trace potential hazards to their specific causes.
      2. Trace identified mitigations to the potential hazards.
      3. Trace specific causes of software-related hazards to their location in the software.
   2. Requirements Analysis and Specification
      1. Trace Software Requirements to System Requirements.
      2. Trace Software Requirements to hardware, user, operator, and software interface requirements.
      3. Trace Software Requirements to Risk Analysis mitigations.
   3. Design Analysis and Specification (high level)
      1. Trace High-Level Design Specifications to Software Requirements.
      2. Trace design interfaces to hardware, user, operator, and software interface requirements.
      3. Evaluate the design for introduction of hazards; trace to the Hazard Analysis as appropriate.
   4. Design Analysis and Specification (detailed)
      1. Trace Detailed Design Specifications to the High-Level Design.
      2. IMPORTANT: Be able to demonstrate traceability of safety-critical software functions and safety-critical software controls to the detailed design specifications.
   5. Source Code Analysis (Implementation)
      1. Trace Source Code to Detailed Design Specifications.
      2. Trace unit tests to Source Code and to Design Specifications.
         1. Verify an appropriate relationship between the Source Code and the Design Specifications being challenged.
   6. Integration
      1. Trace integration tests to High-Level Design Specifications.
      2. IMPORTANT: Use High-Level Design Specifications to establish a rational approach to integration and to determine regression testing when changes are made.
   7. Validation
      1. Trace system tests to Software Requirement Specifications.
      2. Use a variety of test types.
         1. Design test cases to address concerns such as robustness, stress, security, recovery, usability, etc.
      3. Use traceability to assure that the necessary level of coverage is achieved.
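The coverage check described under Validation can be sketched in a few lines of Python: trace forward from requirements to the tests that cover them, and backward from tests to the requirements they cite. The requirement and test-case IDs below are invented for illustration:

```python
# Hypothetical traceability data: which Software Requirement
# Specifications each system test claims to cover.
requirements = {"SRS-1", "SRS-2", "SRS-3"}
test_traces = {
    "TC-01": {"SRS-1"},
    "TC-02": {"SRS-1", "SRS-2"},
}

def coverage_gaps(reqs, traces):
    """Forward trace: requirements with no covering test case."""
    covered = set().union(*traces.values()) if traces else set()
    return sorted(reqs - covered)

def orphan_tests(reqs, traces):
    """Backward trace: test cases citing no known requirement."""
    return sorted(t for t, r in traces.items() if not r & reqs)

print(coverage_gaps(requirements, test_traces))  # ['SRS-3'] is untested
print(orphan_tests(requirements, test_traces))   # [] - every test maps back
```

This is the "two-way street" from section 3 in miniature: the forward check finds untested requirements, and the backward check finds tests that trace to nothing.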
5. Plan Ahead for Traceability
   1. Options
      1. Manual methods
         1. Word processors
         2. Spreadsheets
      2. "Home-built" automated systems
         1. Relational databases
      3. Commercial automated systems
         1. DOORS
         2. Requisite Pro
Rule 5: Estimation shall be supported by tools
Tools that help to reach the estimate quickly (e.g., a spreadsheet containing metrics) should be used. In this case, the spreadsheet automatically calculates the cost and duration of each testing phase. A document containing sections such as a cost table, risks, and free notes should also be created and sent to the customer. It also shows the different options for testing, which can help the customer decide which kind of test he needs.
Rule 6: Estimation shall always be verified
Finally, every estimate should be verified. Another spreadsheet can be created for recording the estimates. Each new estimate is compared against the previously recorded ones to see whether it follows a similar trend. If the estimate deviates from the recorded ones, a re-estimation should be made.
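The verification step in Rule 6 can be sketched as a simple trend check in Python. The 25% tolerance and the historical figures below are illustrative assumptions, not values from the text; in practice the history would come from the recording spreadsheet:

```python
def needs_reestimation(new_estimate, history, tolerance=0.25):
    """Flag the new estimate for re-estimation if it deviates more than
    `tolerance` (as a fraction) from the mean of recorded estimates."""
    if not history:
        return False  # nothing recorded yet, nothing to compare against
    mean = sum(history) / len(history)
    return abs(new_estimate - mean) / mean > tolerance

# Hypothetical person-day estimates recorded for earlier, similar projects.
recorded = [80, 95, 100, 105]

print(needs_reestimation(96, recorded))   # False: close to the trend
print(needs_reestimation(160, recorded))  # True: deviates, re-estimate
```

Comparing against a mean is only one possible trend check; a real spreadsheet might instead compare against the most similar past project, but the principle of flagging outliers for re-estimation is the same.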