SUMMARY:
• A creative and professional team player with over five years of diverse
experience in Software Testing and Quality Assurance.
• Excellent understanding of the Software Development Life Cycle and Test
Methodologies, from project definition to post-deployment documentation.
• Experience in writing test plans, defining test scenarios and test cases,
developing and maintaining test scripts based on business and functional
requirements. Documented all phases of QA process.
• Extensive experience working with Insurance, Financial, Brokerage, and
E-commerce (B2B) applications.
• Extensive experience in running various kinds of tests such as Regression,
Functional, Performance, Backend, User Acceptance Test (UAT) for Web and
Client/Server Applications.
• Proficient in using tools such as WinRunner, Quality Center, QuickTest Professional
(QTP), and LoadRunner.
• Extensive experience in writing test scripts using TSL in WinRunner and VBScript
in QTP.
• Ability to interact with developers and product analysts regarding testing status
and defect tracking.
• Self-starter and team player with excellent communication, problem solving skills,
interpersonal skills and a good aptitude for learning.
TECHNICAL SKILLS:
PROFESSIONAL EXPERIENCE:
Description:
The objective of this project was to build a complete timesheet system for PIC, converting an
existing paper-based timesheet process into a web-based timesheet management system. Users
(PIC employees) would enter and report time against project activities on the new website. The
Administrator would set up the system by creating and managing projects and by linking people
to projects, activities, and operations. Managers would define tasks within their own projects,
activities, and operations; all users would enter the time spent on the tasks allotted to them, and
managers would analyze the time entered and review the reports. Also assisted with some
business analysis activities.
Responsibilities:
• Involved in designing and developing high-level test plans based on the functional
specifications and user requirements.
• Created test cases for different scenarios.
• Manually tested and verified the new/upgraded Application by performing functional and
regression testing.
• Tested the compatibility of the application to support different web browsers and
supported versions on different platforms.
• Also involved in testing the functionality and usability of the system from an end-user
perspective.
• Developed traceability matrix between requirements and test cases.
• Conducted database testing by extensive use of SQL.
• Reviewed, analyzed, manually compared, documented and communicated test results.
• Participated in bug tracking and reporting.
• Created Vuser scripts using LoadRunner.
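Backend testing of the kind listed above typically pairs a UI action with a direct SQL check against the database. A minimal sketch of that pattern, using SQLite in place of the project's actual database (the table, columns, and values here are hypothetical, purely for illustration):

```python
import sqlite3

# Hypothetical timesheet table standing in for the real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE timesheet (emp_id INTEGER, project TEXT, hours REAL)")
conn.executemany(
    "INSERT INTO timesheet VALUES (?, ?, ?)",
    [(1, "PIC-WEB", 8.0), (1, "PIC-WEB", 7.5), (2, "PIC-ADMIN", 6.0)],
)

def verify_total_hours(conn, emp_id, expected):
    """Backend check: compare the DB aggregate to the total shown in the UI."""
    (actual,) = conn.execute(
        "SELECT SUM(hours) FROM timesheet WHERE emp_id = ?", (emp_id,)
    ).fetchone()
    return actual == expected

print(verify_total_hours(conn, 1, 15.5))  # → True when UI and DB agree
```

The same query, run by hand in a SQL client, is what "database testing by extensive use of SQL" usually amounts to; scripting it makes the comparison repeatable across regression runs.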
Description:
Avexus offers asset management, maintenance, repair and overhaul (MRO) solutions for the
aviation industry. These solutions were developed in Oracle 10g (Oracle Forms and Reports) and
Oracle database. They needed to convert the Oracle Forms to open-source Java to reduce
licensing costs for their customers.
As part of the team, I was involved in the project as a QA Tester in the major effort to migrate all
current forms developed in Oracle 10g to open-source Java and J2EE technologies. The
transformation had to ensure that the existing business logic and functionality were retained. The
new user interface was developed in Flex 1.5, and the back end was transformed into J2EE
server-side components interacting with the Oracle database (with a vision to make the system
database-independent in the future). The transformed system was deployed on the existing
environment at AVEXUS.
Responsibilities:
• Developed test cases, test strategies, and test summaries.
• Interacted and worked with the development team to solve problems related to
testing.
• Manually compared the converted Java forms with the original Oracle forms.
• Performed system testing and GUI testing.
• Performed extensive regression testing on subsequent versions of the application using
WinRunner.
• Generated SQL statements to extract data from the database and verified data integrity.
• Participated in defect review meetings and test case review meetings.
• Reported defects in the defect log using Test Director.
• Worked closely with the client's product development team, including business analysts,
end users, and application developers, to ensure high-quality releases of the converted
Forms-to-Java applications.
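Manually comparing a converted Java form against its Oracle original is a field-by-field diff, and that diff can be automated for regression runs. A sketch of the idea (the form fields and values below are hypothetical, not from the actual Avexus application):

```python
# Illustrative sketch: automate the field-by-field comparison between a
# legacy Oracle form and its converted Java counterpart.

def diff_forms(oracle_form: dict, java_form: dict) -> list:
    """Return (field, oracle_value, java_value) tuples for every mismatch."""
    mismatches = []
    for field in sorted(set(oracle_form) | set(java_form)):
        o, j = oracle_form.get(field), java_form.get(field)
        if o != j:
            mismatches.append((field, o, j))
    return mismatches

oracle = {"tail_number": "N123AB", "status": "IN_REPAIR", "hours": 4210}
java = {"tail_number": "N123AB", "status": "IN REPAIR", "hours": 4210}
print(diff_forms(oracle, java))  # → [('status', 'IN_REPAIR', 'IN REPAIR')]
```

An empty result means the converted form retained the original business output; any tuple returned is a candidate defect to log.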
Client: FLEET BANK, Jersey City, NJ
Project: Online bill processing & E-payments
Role: QA Analyst
Duration: Aug '04 - Jan '06
Environment: J2EE, Java, Exodus Tool, Flex, Oracle 10g (Forms & Database), WinRunner,
Test Director, LoadRunner, DML SQL
Description:
Involved in system testing of the 'Account Access and Online Banking' web application, which
provides future and existing customers with information about mortgages and loans, credit card
applications, products and services, investments, wealth management, and brokerage services. It
allows customers to access and administer their various Checking, Savings, and Money Market
accounts, and provides an 'Account Summary' (balances, last deposits, transaction history,
etc.). Gained sound experience manually testing mainframe and open-system applications.
Responsibilities:
• Analyzed the business requirements document and developed the test plan, test
objectives, test strategies, test priorities, and test cases.
• Prepared pre-conditioned data for testing the application.
• Performed functional and regression testing using WinRunner.
• Created Vuser scripts using LoadRunner.
• Monitored performance counters to identify application bottlenecks.
• Tested the payments module of the application, which allows users to maintain their
payments.
• Participated in the verification of transactions using predefined conditions and algorithms.
• Executed SQL commands for backend testing.
• Executed test cases manually and reported defects using Test Director.
• Developed manual test scripts and test cases.
• Developed and enhanced LoadRunner HTTP Vuser scripts (VBScript) to simulate various
business transactions.
• Used the LoadRunner scheduler to create various real-time load scenarios.
• Logged tickets using Rational ClearQuest.
• Involved in system integration and validation of populated data.
• Prepared test data for positive and negative test scenarios for functional testing, as
documented in the test plan.
• Involved in system test planning using Task Distribution Diagrams and Transaction
Profile documents.
Description:
Oxford Instruments mainly produced large superconducting magnets. I was responsible for
testing hardware and software in the quality assurance department, helping create the test plan,
test conditions, and test cases from the Software Requirement Specification and Functional
Design Specification provided by the Business Analyst.
Responsibilities:
• Involved in manual and automated testing of the application.
• Involved in the development and execution of test plans, test cases, and test scripts by
reviewing the business requirements document and the technical specifications document.
• Documented test cases corresponding to business rules and other operating conditions.
• Tested UNIX shell scripts developed for batch processing.
• Involved in developing UAT test plans.
• Involved in white-box testing to verify the functionality of the code.
• Performed functional and black-box testing.
• Developed test scripts in VBScript using QTP.
• Conducted browser compatibility testing of the application.
• Developed test cases and test scripts in Test Director.
• Conducted black-box testing (functional and regression testing) using WinRunner.
• Used WinRunner to capture, verify, and replay user interactions automatically to identify
defects.
• Created basic scripts from manual test cases, added verification steps, managed objects
in the object repository, and customized checkpoints with parameters and Data Table formulae.
• Used SQL queries for data verification.
• Edited automated scripts by inserting logical commands to handle complicated test
scenarios.
• Maintained detailed test logs and reported test results in Mercury Test Director.
• Developed a traceability matrix between requirements and test cases.
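A traceability matrix of the kind mentioned above maps each requirement to the test cases that cover it, so coverage gaps are visible at a glance. A minimal sketch (requirement and test-case IDs are hypothetical):

```python
# Illustrative traceability matrix: requirement ID -> covering test cases.
matrix = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # gap: no test case covers this requirement yet
}

# Flag requirements with no test coverage.
uncovered = [req for req, cases in matrix.items() if not cases]
print(uncovered)  # → ['REQ-003']
```

In practice the same matrix is usually kept as a spreadsheet or inside Test Director, but the check is identical: every requirement row must have at least one test case column marked.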
Education: