
Abdul Shahnewaz

Tel: 202-468-5461
E-mail: abdul2425@yahoo.com
Work Status: US Citizen
Security Clearance: Active Public Trust (6c)

Professional Skills:

• Over ten (10+) years of experience in Software Testing/Quality Assurance, with a master's degree in Information Systems and diversified experience in manual and automated testing of Web and Client/Server applications in Unix/Windows environments.
• Experienced with FEMA's current systems as well as their underlying technologies.
• Experienced with Software Development Life Cycle (SDLC), Software Engineering Life Cycle (SELC), and Information Technology Lifecycle Management (ITLM).
• Experienced in developing new IT security processes.
• Experienced in writing Development Test Plans (DTP) and Development Test Analysis Reports (DTAR).
• Experienced in specifying, evaluating, reviewing, and analyzing systems.
• Involved in planning, analyzing, developing, implementing, maintaining, and enhancing applications.
• Proficient in creating queries in TFS to validate results of TFS reports.
• Proficient in creating Test Plans, TARs, and LOEs, and in developing and executing Test Cases and Test Scenarios.
• Extensive experience in Manual, Functional, Integration, Regression, Database, and User Acceptance Testing (UAT).
• Experienced in QTP/UFT, Quality Center (HP ALM), Test Director, WinRunner, PVCS Tracker, Team
Foundation Server.
• Utilized VBScript expertise to develop QTP/UFT scripts.
• Designed Data Driven and Hybrid Automation Framework using QTP/UFT.
• Experienced in Testing Database Applications of RDBMS in ORACLE and SQL Server.
• Experienced in developing web pages using HTML, DHTML, Photoshop, and Flash.
• Performed Section 508 accessibility compliance testing using JAWS.
• Followed the CMMI (Capability Maturity Model Integration) process.
• Excellent organizational, communication, analytical, and problem-solving skills.
• Excellent technical documentation and reporting skills.
• Skilled in adapting to rapidly changing work environments. 
• Involved with CMMI Level 3 processes.
• Ability to convey technical concepts to non-technical audiences.
• Experienced in independently providing a range of information technology customer services, such as user account support and application support.
• Performed systems administration and troubleshooting support, addressing, investigating, and resolving workstation issues.

EDUCATION COMPLETED:

• Master of Science (Software Engineering and Management), Strayer University, Alexandria, VA, 2012
• Bachelor of Science (Computer and Information Systems), University of Maryland, College Park, MD, 2005

TECHNICAL SKILLS:

• Platforms: Windows 7 / Vista / XP, Mac OS X
• Mobile Platforms: iOS, Android OS, Windows Phone
• Programming: Java, Ruby, HTML, CSS, XML, JavaScript, SQL
• Database: Oracle
• Test Automation: Selenium WebDriver, Cucumber, Calabash
• Simulators: Xcode, Android Virtual Devices
• Networking: TCP/IP, FTP, HTTP
• Virtualization: VMware Server, Oracle VirtualBox
• Bug Tracking: JIRA, Bugzilla, Team Foundation Server
• Browsers: MS Internet Explorer, Mozilla Firefox, Chrome, Safari, Opera, Edge
• Applications: MS Office, Open Office
• Design Tools: Adobe Photoshop / Illustrator / InDesign / Acrobat Pro, CorelDraw

TRAINING:

• SECTION 508 COMPLIANCE: Introduction to Section 508, Web Accessibility Testing Process and Tools, Web Accessibility Testing (WAT), ACCVerify (Automation Tools), and Screen Reader JAWS (Feb 2019)
• ITIL (Information Technology Infrastructure Library)
• SharePoint (06/21/11 and 06/23/11)

PROFESSIONAL EXPERIENCES:

Federal Emergency Management Agency November 2019 - Present


Software Test Engineer
Contractor- Techno gems, Inc.

The mission of FEMA (Federal Emergency Management Agency) is to support citizens and first responders and to promote that, as a nation, we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.
RESPONSIBILITIES:
• Employed agile methodologies, especially SCRUM, to ensure rapid iterative software development.
• Responsible for directing the activities of the Quality Engineering function with an emphasis on
application quality, on-time delivery, and application efficiency.
• Led cross-functional problem-solving teams and implemented quality improvement tools and techniques, using engineering tools, quality tools, and preventive and corrective actions to overcome barriers to continuous quality improvement.
• Ensured proper internal communication of required application information was conducted consistently and effectively.
• Demonstrated ability to effectively implement continuous improvement initiatives.
• Developed and managed production systems within the application line by establishing requirements, conduct, and measurement criteria.
• Developed and/or contributed to FEMA product control plans, customer acceptance, documentation, and other methods.
• Performed other duties and special projects as assigned.
• Led efforts in achieving all customer Quality System Requirements.

Environment: Windows Server, Oracle Database, Selenium WebDriver, SQL, Java, ANT, SoapUI, REST, SOAP, Jira, TestNG, Agile, Windows, UNIX.

United States Department of State October 2018 - October 2019

Software Test Engineer
Contractor- Acuity Inc.

The Foreign Affairs Network (FAN), developed under the Department's Bureau of Information Resource
Management (IRM), is a portfolio of secure, cloud-based services that enable a more mobile, productive,
and collaborative workforce. With FAN, the foreign affairs community can leverage modern cloud services
that have standardized security and management frameworks and streamlined procurement options.
RESPONSIBILITIES:
• Analyzed the requirements and provided accurate effort estimates during the Planning Poker estimation phase.
• Involved in System Integration testing, Multi Browser testing, Production Validation testing,
Assembly testing, and Database testing.
• Used Selenium WebDriver APIs to write JUnit test suites and test cases for functional testing (see the sketch below).
• Performed defect tracking and management, interfacing with development teams, configuration
management team, RCA team, environment team, UAT team, and other application teams.
• Involved in Test Result analysis, Defect Management, and Risk analysis.
• Introduced and implemented the open-source web testing tool Selenium WebDriver for cross-browser testing.
• Participated in the User Stories discussion and test scenarios preparation.
• Developed SQL Scripts for Backend Testing to ensure that the data is updated as per the Business
Rules.
• Coordinated with developers and recorded defects in JIRA, tracking them until they were resolved.
• Prepared user documentation with screenshots for User Acceptance Testing (UAT).
• Prepared data requirements for each Sprint release; designed, reviewed, and published test cases during the demo.
• Provided strategies for reusing data requirements, which helped save effort.
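
A minimal sketch of the kind of JUnit/Selenium WebDriver functional test described above; the class name, URL, and locators are hypothetical placeholders rather than the actual application under test:

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    // Hypothetical functional test: verifies that a sign-in page loads
    // and shows the expected heading. URL and locators are illustrative only.
    public class SignInPageTest {
        private WebDriver driver;

        @Before
        public void setUp() {
            driver = new FirefoxDriver();             // any supported browser driver
            driver.get("https://example.gov/signin"); // placeholder URL
        }

        @Test
        public void headingIsDisplayed() {
            String heading = driver.findElement(By.tagName("h1")).getText();
            assertEquals("Sign In", heading);         // expected value is illustrative
        }

        @After
        public void tearDown() {
            driver.quit();                            // release browser resources
        }
    }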

Environment: Windows Server, TCP/IP, UFT, Oracle Database, Selenium WebDriver, RC, SQL, Java,
ANT, JUnit, SoapUI, REST, SOAP, Jira, JMeter, Agile, Windows, UNIX.

United States Dept. of Health & Human Services April 2018 - September 2018
Software Test Engineer
Contractor- Futrend Technology, Inc.

The Federal Select Agent Program (FSAP) regulates the possession, use, and transfer of biological
select agents and toxins that have the potential to pose a severe threat to public, animal or plant health,
or to animal or plant products. Common examples of select agents and toxins include the organisms that
cause anthrax, smallpox, and bubonic plague, as well as the toxin ricin.

RESPONSIBILITIES:
• Created a Test Automation Framework with Cucumber and Selenium WebDriver (see the sketch below).
• Performed defect tracking and management, interfacing with development teams, configuration
management team, RCA team, environment team, UAT team, and other application teams.
• Involved in Test Result analysis, Defect Management, and Risk analysis.
• Generated user scripts using the Virtual User Generator.
• Introduced and implemented the open-source web testing tool Selenium WebDriver for cross-browser testing.
• Participated in the User Stories discussion and test scenarios preparation.
• Developed SQL Scripts for Backend Testing to ensure that the data is updated as per the Business
Rules.
• Coordinated with developers and recorded defects in JIRA, tracking them until they were resolved.
• Performed the tasks of validating software by defining automated testing components.
• Handled the tasks of preparing defect reports, developing automation strategies, and performing automated testing of back-end systems.
• Assigned responsibilities of developing functional testing automation scripts and conducting defect
root cause analysis.
• Handled responsibilities of executing the test strategy, writing test scripts, and creating automated GUI tests.
• Assigned the tasks of maintaining usage logs, executing test suites, and gathering technical requirements to meet project needs.
• Performed responsibilities of defect identification, writing test procedures, and reporting metrics.
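
A minimal sketch of the Cucumber/Selenium WebDriver framework approach noted above; the step wording, URL, and locators are hypothetical and not taken from the actual application:

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.Then;
    import io.cucumber.java.en.When;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import static org.junit.Assert.assertFalse;

    // Hypothetical step definitions binding a Gherkin scenario
    // ("Given the user is on the search page ...") to Selenium actions.
    public class SearchSteps {
        private final WebDriver driver = new ChromeDriver();

        @Given("the user is on the search page")
        public void userIsOnSearchPage() {
            driver.get("https://example.gov/search");        // placeholder URL
        }

        @When("the user searches for {string}")
        public void userSearchesFor(String term) {
            driver.findElement(By.name("q")).sendKeys(term); // illustrative locators
            driver.findElement(By.id("submit")).click();
        }

        @Then("results are displayed")
        public void resultsAreDisplayed() {
            assertFalse(driver.findElements(By.cssSelector(".result")).isEmpty());
            driver.quit();                                   // end-of-scenario clean-up
        }
    }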

Environment: Windows Server, TCP/IP, UFT, Oracle Database, Selenium WebDriver, RC, SQL, Java,
ANT, JUnit, SoapUI, REST, SOAP, Jira, JMeter, Agile, Windows, UNIX.

United States Census Bureau February 2016 - March 2018

Software Test Engineer
Contractor- Accenture Federal Services

The U.S. Census counts every resident in the United States. To get an accurate count, the Census Bureau must deliver a complete and accurate address list and spatial database for enumeration, determine the type and address characteristics of each living quarter, collect response data via the Internet to reduce paper and Non-Response Follow-up, and maximize online response to the 2020 Census through contact strategies and improved access for respondents.

RESPONSIBILITIES:
• Interfaced with Product Owner, stakeholders, Scrum Masters, and Developers to fully understand
user story requirements and testing needs.
• Provided functional testing capabilities within an Agile environment using SCRUM.
• Developed manual test cases and automation scripts for native mobile applications (iOS, Android) using Appium with Selenium WebDriver (see the sketch below).
• Utilized the HP Functional Test Suite (Unified Functional Testing v12.5, Application Lifecycle Management (ALM)) for test automation.
• Created LoadRunner scenarios and scheduled Virtual Users to generate realistic load on the server using a LoadRunner load generator machine.
• Developed complex SQL queries during End-to-End testing stage.
• Created and executed test cases that validate the acceptance criteria of each user story and support
the team's definition of completion.
• Analyzed and interpreted requirement specifications.
• Involved in developing system requirements and procedures, and in supporting and maintaining test environments.
• Installed new builds in all test environments and configured them.
• Performed Regression, Integration, and Smoke Testing.
• Worked closely with software developers, business analysts, System Admin and other project
management personnel involved in Software Development Life Cycle Team.
• Performed back-end testing using Oracle Database.
• Created test users with privileges for the testing environment.
• Performed report testing for the PEGA application.
• Assessed customers' needs and identified products to meet them.
• Involved in the team meetings with representatives from Development, Database Management,
Configuration Management, and Requirements Management to identify and correct defects.
• Performed Section 508 compliance verification testing and reviewed technical requirements of software.
• Documented plans and guidelines to provide support in 508 testing activities.
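
A minimal sketch of the Appium/Selenium WebDriver approach to native mobile testing mentioned above; the capability values, package/element IDs, and server URL are hypothetical placeholders:

    import io.appium.java_client.android.AndroidDriver;
    import java.net.URL;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.remote.DesiredCapabilities;
    import static org.junit.Assert.assertTrue;

    // Hypothetical smoke test for an Android build of a native app:
    // launches the app on an emulator and checks that the home screen renders.
    public class MobileSmokeTest {

        @Test
        public void homeScreenIsDisplayed() throws Exception {
            DesiredCapabilities caps = new DesiredCapabilities();
            caps.setCapability("platformName", "Android");
            caps.setCapability("deviceName", "emulator-5554");        // AVD name (placeholder)
            caps.setCapability("app", "/path/to/app-under-test.apk"); // placeholder path

            AndroidDriver driver =
                new AndroidDriver(new URL("http://127.0.0.1:4723/wd/hub"), caps);
            try {
                assertTrue(driver.findElement(
                        By.id("com.example:id/home_title")).isDisplayed()); // illustrative ID
            } finally {
                driver.quit();   // always release the session
            }
        }
    }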

Environment: Windows Server, TCP/IP, UFT, Oracle Database, AirWatch, Citrix, Windows, ALM Client, PEGA, QR Reader, VPN, and Redmine

United States Department of Labor April 2013 - January 2016

Quality Assurance Analyst
Contractor- Definitive Logic

The Departmental E-Budgeting System (DEBS) is an integrated budget environment that blends a set of
COTS and GOTS solutions to optimize resources throughout the budget formulation lifecycle. The set of
tools and techniques associated with DEBS permits users to track, spread, report, and analyze budget and performance data within agencies and across the Department of Labor for greater transparency.
RESPONSIBILITIES:
• Developed and documented quality assurance processes and software testing approaches.
• Developed and documented system test plans.
• Developed and maintained black box software testing suites.
• Developed and maintained automated regression test cases in Selenium WebDriver using Java
programming language.
• Used Selenium IDE to test the accessibility of all webpages.
• Executed automated Selenium scripts and reproduced failures manually.
• Managed the configuration of testing suites across multiple environments.
• Managed testing environments and automated testing processes.
• Performed integration, system, and regression testing.
• Executed automated test scripts using JMeter based on business and functional specifications.
• Worked with the Information Security team to integrate systems security testing into the test plans.
• Worked with the customer to execute user acceptance testing.
• Developed software release metrics and presented software demonstrations to the customers.
• Documented test results and communicated those results to the customer.
• Documented system defects, approaches, and resolutions in the team tracking system.

Environment: Windows Server, TCP/IP, Selenium WebDriver, Test Explorer, ASP, ASP.NET, VB.NET,
J2EE, VB, Oracle, Version Manager (VM), Citrix, and Windows

United States Citizenship & Immigration Service November 2011 - March 2013

Software Test Engineer
Contractor- EIS (Enterprise Information Services)

CLAIMS (Computer Linked Application Management System) 3 LAN (Local Area Network) is an umbrella
system that incorporates casework-oriented software subsystems and supports the USCIS application
receipt, adjudication, and notification processes. CLAIMS 3 also provides automated support for the full range of benefits functions and processes. CLAIMS 3 resides on two platforms: LAN (Local Area Network) and a Mainframe system.

RESPONSIBILITIES:
• Utilized the HP Functional Test Suite (Unified Functional Testing, Application Lifecycle Management (ALM)) for test automation.
• Ran the automated QTP scripts against the same application in different environments (Unit, FQT).
• Involved in debugging and testing with automation (QTP) scripts.
• Developed complex SQL queries during End-to-End testing stage.
• Responsible for conducting test of software components by coordinating with product engineers.
• Handled the tasks of interacting with engineers in performing automated testing and in developing
testing tools.
• Performed the tasks of validating software by defining automated testing components.
• Handled the tasks of preparing defect reports, developing automation strategies, and performing automated testing of back-end systems.
• Assigned responsibilities of developing functional testing automation scripts and conducting defect
root cause analysis.
• Handled responsibilities of executing the test strategy, writing test scripts, and creating automated GUI tests.
• Assigned the tasks of maintaining usage logs, executing test suites, and gathering technical requirements to meet project needs.
• Performed responsibilities of defect identification, writing test procedures, and reporting metrics.
Environment: Windows Server, TCP/IP, UFT, ALM, Test Explorer, ASP, ASP.NET, VB.NET, C/C++,
C#, J2EE, VB, Pervasive Database, ITDL, Version Manager (VM), Citrix, and Windows

United States Citizenship & Immigration Service August 2007 - October 2011


Software Test Engineer
Contractor- CSC (Computer Sciences Corporation), currently known as CSRA

CLAIMS (Computer Linked Application Management System) 3 LAN (Local Area Network) is an umbrella
system that incorporates casework-oriented software subsystems and supports the USCIS application
receipt, adjudication, and notification processes. CLAIMS 3 also provides automated support for the full range of benefits functions and processes. CLAIMS 3 resides on two platforms: LAN (Local Area Network) and a Mainframe system.

RESPONSIBILITIES:
• Reviewed formal deliverable documents and traced requirements throughout the software
development life cycle.
• Understood and interpreted requirement specifications.
• Involved in developing system requirements and procedures, and in supporting and maintaining test environments.
• Reviewed Statements of Work (SOW), System Requirements Documents (SRD), Requirements Traceability Matrices (RTM), System Design Documents (SDD), Version Description Documents (VDD), Project Management Plans, and Reference Manuals.
• Gathered requirements (System Change Requests, SCRs), analyzed them, and designed test cases for each individual release.
• Responsible for application features and for preparing several Development Test Plans (DTP) and Development Test Analysis Reports (DTAR).
• Developed Master Test Cases for reuse with each new release.
• Developed the Master Test Plan and Test Analysis Report.
• Responsible for sending all documents for peer review (internal and external) and updating documents based on the peer review checklist.
• Responsible for updating documents in Serena VM (Version Manager).
• Responsible for writing Test Cases for Integration Testing, Functional Qualification Testing (FQT), Positive Testing, Negative Testing, Boundary Testing, Conversion Testing, User Acceptance Testing (UAT), and Regression Testing.
• Responsible for creating and updating FQT Status Reports and uploading them to the network drive and SharePoint.
• Applied QA methodology to ensure good quality control.
• Involved in unit testing with developers to maintain conformance to the SCR specification.
• Updated defects as Test Problem Reports (TPR) in Serena TRACKER.
• Installed new builds in all test environments and configured them.
• Created baseline verification in automation (QTP) and manually, before and after each new build.
• Performed Regression, Integration, and Data-driven testing using Mercury Interactive QuickTest Professional (QTP).
• Created user-defined functions and function libraries, and maintained the initialization scripts to set up the work environment using QTP.
• Created linear scripts using QTP.
• Enhanced the scripts in QTP by applying checkpoints, parameterization, synchronization points, and data-driven tests.
• Created QTP scripts in Data Driven, Modular, and Keyword Driven frameworks, ran the scripts, and created PDF reports.
• Ran the automated QTP scripts against the same application in different environments (Unit, FQT).
• Involved in debugging and testing with automation (QTP) scripts.
• Worked closely with software developers, business analysts, System Admin, CM Team and other
project management personnel involved in Software Development Life Cycle Team.
• Performed back-end testing using Pervasive Database.
• Created test users with privileges for the testing environment.
• Converted documents from the System Development Life Cycle (SDLC) to the System Engineering Life Cycle (SELC).
• Assessed customers' needs and identified products to meet their needs.
• Involved in the team meetings with representatives from Development, Database Management,
Configuration Management, and Requirements Management to identify and correct defects.
• Assigned responsibilities of developing functional testing automation scripts and conducting defect
root cause analysis.
• Handled responsibilities of executing the test strategy, writing test scripts, and creating automated GUI tests.
• Assigned the tasks of maintaining usage logs, executing test suites, and gathering technical requirements to meet project needs.
• Performed responsibilities of defect identification, writing test procedures, and reporting metrics.
• Used Team Foundation Server to open bugs, create tasks, and user stories.
• Tested multiple content controls created and incorporated into TFS work item types.

Environment: Windows Server, TCP/IP, QTP, Test Explorer, ASP, ASP.NET, VB.NET, C/C++, C#,
J2EE, VB, Pervasive Database, ITDL, Serena TRACKER, Version Manager (VM), Citrix, UNIX, and
Windows

The Space & Naval Warfare Systems Command (SPAWAR) March 2006 - July 2007
Quality Assurance Engineer
Contractor- Bruhn-NewTech

CBRN-E is a risk management software system for hazards arising from the use of Chemical, Biological,
Radiological/Nuclear, and Explosive weapons (CBRN-E) and Releases Other Than Attack (ROTA) of
radiological material and Toxic Industrial Materials (TIM).

RESPONSIBILITIES:
• Developed Test Plans, Test Scenarios, and Test Cases from business, technical, and functional requirements.
• Performed Requirement Assessments for the projects.
• Created Requirement Traceability Matrix, TAR, Executive Summary and LOE for the project.
• Performed Manual Testing to check the functionalities of the entire application.
• Used different browsers on different platforms to conduct User Acceptance Testing.
• Performed Data Driven testing using QTP.
• Performed Functional, Regression, Integration and End to End testing using QTP.
• Tested the functionality of the most commonly used panels using QTP; created and added logic to the scripts with conditional statements, loops, and arithmetic operators to create more powerful and complex tests.
• Used Quality Center for storing requirements, creating and executing test cases, defect tracking, and complete test management.
• Coordinated and monitored the testing of defects, bugs, and releases, and monitored their smooth transition into production.
• Tested all the applications under different Operating Systems.
• Performed 508 Compliance testing using JAWS as a screen reader.
• Checked the application's compliance as defined by its effectiveness in fulfilling the requirements of Section 508.
• Identified software errors and interacted with developers to resolve technical issues.

Environment: Windows Server, UNIX, TCP/IP, QTP, Quality Center, Test Explorer, ASP, ASP.NET, VB.NET, C/C++, C#, COM/DCOM, J2EE, WebSphere, HTML, DHTML, VB.
