
Vikram Reddy

Email: vikram9423@gmail.com Cell: 937 430 8156

Summary:

• 9 years of experience in data testing across Data Migration, Data Extraction, Data Cleansing, Data Profiling, and Data Quality for various business data feeds.
• In-depth knowledge of Data Warehousing concepts with emphasis on ETL and full life-cycle development, including requirement analysis, design, development, testing, and implementation.
• Well versed in various software development methodologies: Waterfall, Agile, Iterative, and Extreme Programming.
• Experience in software QA and testing methodologies, verification, and validation in all phases of the SDLC.
• Solid experience with tools such as Quick Test Pro (QTP), Quality Center, SharePoint, and ClearQuest.
• Experience writing automation scripts for various QTP add-ins (Web, Java, Visual Basic, Flex) and integrating the scripts with Test Director/Quality Center to manage the entire automation testing process.
• Extensive experience with utility tools such as TOAD, SQL Server Management Studio, SQL Developer, and SQL*Plus.
• In-depth technical knowledge and understanding of Data Warehousing, data validation, OLAP, SQL Server, Oracle, and ETL.
• Strong working experience in data warehouse testing covering data conversion, data extraction, data transformation, and data loading (ETL).
• Experienced in ETL testing for data warehouse projects using Informatica, Ab Initio, and DataStage.
• Tested Cognos reports to verify they were generated per company standards.
• Extensive experience with data warehouse ETL tools such as DataStage, Informatica, Ab Initio, and SSIS, and with the AutoSys scheduler.
• Experience in the development of data warehouse, data mart, and transactional models.
• Good knowledge of Business Intelligence, OLAP, dimensional modeling, star and snowflake schemas, and the extraction, transformation, and loading (ETL) process.
• Experienced in HP Quick Test Professional, Quality Center, ALM, LoadRunner, and JIRA.
• Strong technical knowledge of UNIX utilities and shell scripting to automate processes.
• Ability to develop complex SQL scripts and procedures for data validation testing.
• Extensive experience in manual testing of ETL and Business Intelligence applications.
• Expert in reviewing and analyzing requirements, use cases, technical specifications, and database designs.
• Expert in developing Master Test Plans, test designs, test estimations, specifications, test cases, test scenarios, requirements traceability matrices, test closure summaries, and production checkout documents.
• Experienced in documenting defects and generating defect and test progress graphs with overall test summary/closure documents in Quality Center.
• Excellent written and oral communication skills; a team player with a results-oriented attitude.
Technical Skills:

Business Modeling Tools: MS Visio, Power Designer


Database Management: DB2, Oracle 9i/10g/11g, SQL Server 2000/2005/2012, Teradata V2R5/V2R6/13
Data Warehouse Tools: Informatica PowerCenter, IBM DataStage
Functional Test Tools: WinRunner 8.0, QTP 9.x
Languages: C, C++, PL/SQL, T-SQL, SQL, Visual Basic 6.0, SAS 9.1
Office Tools: MS Office Suite, MS Project
Operating Systems: IBM AIX, UNIX, Windows
Rational Tools: Requisite Pro, ClearCase, ClearQuest, Manual Tester, DRAT Tool
Reporting Tools: Cognos, Business Objects, Crystal Reports
Tools: IMS, XMLSpy, TOAD, SQL Enterprise Manager, SQL Query Analyzer, WinSQL, Citrix, Control Center
Web Technologies: XML, XSL, XHTML, HTML, CSS, JavaScript, VBScript

Professional Experience:

Premier Inc., Charlotte, NC Oct 14 – Present


Sr. ETL QA Tester
Responsibilities:

• Worked with multiple client-specific data extract processes involving source systems such as Cerner, Allscripts, Epic, and Paragon.
• Handled the staging process for billing, clinical, and claims-related data.
• Involved in gathering client and product requirements; developed mapping documents based on the product design.
• Actively involved with, and provided input to, multiple teams from inception to product delivery.
• Experience in Java scripting with Hadoop, analyzing huge datasets and parsing CSV files.
• Validated logic for slowly changing dimension (Type 1 and Type 2) table loads, flagging records with the update strategy to populate the desired rows.
• Involved in functional testing and system integration testing to verify that data extracted from different source systems loads into the target according to user requirements.
• Validated and signed off on various loads (daily, weekly, and quarterly) using an incremental loading strategy.
• Validated production loads to ensure the full data set was loaded into all tables as part of production support.
• Worked closely with customers during User Acceptance Testing.
• Generated mock data to support the test cases developed.
• Identified opportunities for business process improvement through meetings with business users and developers, and initiated efforts to make improvements.
• Prepared and executed manual test scripts and was responsible for tracking and logging defects in JIRA.
• Involved in daily Scrum meetings to discuss sprint development and progress, and served as Scrum Master, making the meetings more productive.
• Wrote, executed, and performance-tuned SQL queries for data analysis and profiling.
• Worked on BI reports generated by Cognos and Tableau.
• Validated data flow within staging layers.
• Automated end-to-end manual test scripts using DB Fitness.
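The SCD Type 2 validation described above can be sketched roughly as follows. This is a minimal illustration only: the table and column names (`dim_patient`, `current_flag`, `eff_start`/`eff_end`) are hypothetical, and SQLite stands in for the actual warehouse database.

```python
import sqlite3

# Hypothetical SCD Type 2 dimension: each business key should have exactly one
# current row (current_flag = 'Y') and no overlapping effective-date ranges.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_patient (
    patient_key  INTEGER PRIMARY KEY,
    patient_id   TEXT,      -- business key
    name         TEXT,
    current_flag TEXT,      -- 'Y' marks the active version
    eff_start    TEXT,
    eff_end      TEXT
);
INSERT INTO dim_patient VALUES
 (1, 'P100', 'Smith',   'N', '2013-01-01', '2014-06-30'),
 (2, 'P100', 'Smith J', 'Y', '2014-07-01', '9999-12-31'),
 (3, 'P200', 'Jones',   'Y', '2014-01-01', '9999-12-31');
""")

# Check 1: every business key has exactly one current row.
bad_current = conn.execute("""
    SELECT patient_id, COUNT(*) FROM dim_patient
    WHERE current_flag = 'Y'
    GROUP BY patient_id
    HAVING COUNT(*) <> 1
""").fetchall()

# Check 2: no overlapping effective-date ranges for the same business key.
overlaps = conn.execute("""
    SELECT a.patient_id
    FROM dim_patient a
    JOIN dim_patient b
      ON a.patient_id = b.patient_id AND a.patient_key < b.patient_key
    WHERE a.eff_start <= b.eff_end AND b.eff_start <= a.eff_end
""").fetchall()

# Both result sets are empty when the dimension load is clean.
```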

Environment: Fitness Automation Suite, IBM DataStage v8.7, IBM Netezza 7.0.2.11, Aginity Workbench, Linux 2.6.32, AutoSys, Hadoop, Tortoise Git, JIRA
MVP HealthCare Feb 13 – Sep 14

Sr. ETL QA Tester
Responsibilities:

• Worked on requirement analysis, data analysis, and testing of data coming from multiple source systems; responsible for BI data quality.
• Performed business process mapping for new requirements.
• Performed data validation with data profiling.
• Tested reports in Access and Excel using advanced functions including VLOOKUP, pivot tables, and formulas.
• Experience using Java scripting for big data analysis with the Hadoop MapReduce framework, analyzing huge datasets and parsing CSV files.
• Wrote Teradata SQL queries and created tables and views following Teradata best practices.
• Worked with the ETL group to understand DataStage jobs for dimensions and facts.
• Involved in validating SSIS and SSRS packages against functional requirements.
• Executed SAS programs on UNIX.
• Tested several complex Cognos reports, including dashboards, summary reports, master-detail reports, drill-downs, and scorecards.
• Performed integration testing, regression testing, and retesting.
• Compared SAS datasets using PROC COMPARE.
• Executed existing automated test cases using QTP.
• Performed manual functional testing and regression testing with QTP.
• Created complex data analysis queries to troubleshoot issues reported by users.
• Validated SCD Type 1, Type 2, and Type 3 tables.
• Used different DataStage components effectively to develop and maintain the database.
• Involved in creating and maintaining defect reports in ALM 11.
• Evaluated data mining request requirements and helped develop queries for the requests.
• Participated in sessions with management, SMEs, vendors, users, and other stakeholders on open and pending issues.
• Used TOAD to run SQL queries verifying that tables are populated as expected.
• Created test cases, executed test scripts, and tracked and reported system defects using Quality Center.
• Worked on data validation, constraints, and source-to-target row counts.
• Tested various jobs, scheduled and ran jobs, and troubleshot failed jobs.
• Tested several complex reports, including dashboards, summary reports, master-detail reports, drill-downs, and scorecards.
• Wrote complex SQL queries for data validation, verifying SSIS packages and business rules.
• Created various transformations (filter, router, lookup, stored procedure, joiner, update strategy, expression, and aggregator) to pipeline data to the data warehouse/data marts, and monitored the daily and weekly loads.
• Performed extraction, transformation, and loading using different DataStage components and expressions to create test data.
• Reported defects in JIRA, worked closely with analysts and developers to resolve problems, and updated defect status accordingly.
• Involved in backend testing of the application front end using SQL queries against the Teradata database.
• Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
• Tested star schemas, fact tables, and dimension tables.
• Customized complex reports in Excel using intricate formulas.
• Wrote several complex SQL queries for validating Cognos reports.
• Worked with the Data Profiler to obtain data statistics such as null values, max, min, and average, and analyzed sample data before setting the ETL mapping document.
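The profiling statistics mentioned above (null counts, min, max, average) can be gathered in a single aggregate query, sketched here against SQLite with a hypothetical `claims` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id INTEGER, amount REAL);
INSERT INTO claims VALUES (1, 100.0), (2, 250.0), (3, NULL), (4, 50.0);
""")

# One pass over the column yields the profile used to sanity-check source
# data before it goes into the ETL mapping document.  Note that AVG ignores
# NULLs, so the null count must be checked separately.
row = conn.execute("""
    SELECT COUNT(*)                                         AS total_rows,
           SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END)  AS null_count,
           MIN(amount), MAX(amount), AVG(amount)
    FROM claims
""").fetchone()

total_rows, null_count, min_amt, max_amt, avg_amt = row
```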

Environment: SQL Server 2012 & 2008, Hadoop, SSIS, Oracle 11g, IBM DataStage 8.0, JIRA, Quality Center ALM 11, Cognos 8.0 Series, TOAD, Teradata SQL Assistant, Teradata V2R6, XMLSpy 2008, SQL, SAS/STAT, SAS/GRAPH, SAS/ODS, SAS/SQL, PL/SQL, T-SQL, UNIX

Matrix Healthcare, Phoenix, AZ June 2011 – Jan 2013


ETL QA Tester

Responsibilities:

• Analyzed business requirements, system requirements, and data mapping requirement specifications, and was responsible for documenting functional and supplementary requirements in Quality Center 10.
• Tested ETL jobs against business rules using the ETL design document.
• Assisted in implementing fact and dimension tables in a star schema model based on requirements.
• Extensively used DataStage for the extraction, transformation, and loading process, and worked with the ETL group to understand DataStage.
• Tested master-detail, summary, ad-hoc, and on-demand reports using Cognos Report Studio.
• Responsible for testing SSIS (ETL) packages to verify data completeness, data transformation, and data quality, including integration testing, UAT, and regression testing.
• Tested and automated SAS jobs running on a daily, weekly, and monthly basis using UNIX shell scripting.
• Wrote complex SQL, PL/SQL, and T-SQL scripts for querying Teradata and Oracle.
• Defined data requirements and elements used in XML transactions.
• Involved in the design, development, and testing phases of the data warehouse.
• Tested the database schema with the help of data architects using ERwin.
• Involved in testing the data mart using PowerCenter.
• Identified and documented additional data cleansing needs and consistent error patterns that could be averted by modifying ETL code.
• Strong in writing UNIX shell scripts; automated and scheduled DataStage jobs using UNIX shell scripting.
• Involved in testing Cognos reports by writing complex SQL queries.
• Executed test scripts in TOAD and documented the results in the RTM.
• Created new bugs and tracked bug status using JIRA.
• Leveraged JIRA activity history to quickly access recently opened issues and searches.
• Tested the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.
• Wrote complex SQL queries for data validation, verifying SSIS packages and business rules.
• Worked in an Agile environment with Scrum.
• Responsible for data mapping activities from source systems to Teradata.
• Queried the Teradata database and validated the data using SQL Assistant.
• Effectively distributed responsibilities, arranged meetings, and communicated with team members in all phases of the project.
• Used the application's import and export facilities to download/upload XMLs of failed test cases for re-verification, and used SQL*Loader to import flat files into Oracle.
• Scheduled jobs using AutoSys, automated them to run at specific times, and automated the reports.
• Wrote UNIX scripts to perform certain tasks and assisted developers with problems and SQL optimization.
• Configured Quick Test Pro with Quality Center and maintained project information in Quality Center.
• Extensively used AutoSys to schedule jobs on a daily, bi-weekly, weekly, and monthly basis with proper dependencies.
• Wrote complex T-SQL and SQL queries using joins, subqueries, and correlated subqueries.
• Designed and developed UNIX shell scripts as part of the ETL process to automate data loading and extraction.
• Involved in extensive data validation using SQL queries and backend testing.
• Responsible for migrating code changes from the development environment to SIT, UAT, and production environments.
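The source-to-target validation described above reduces to row-count reconciliation plus a set difference. A minimal sketch, with hypothetical `src_members`/`tgt_members` tables and SQLite standing in for Teradata/Oracle:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_members (member_id TEXT, plan TEXT);
CREATE TABLE tgt_members (member_id TEXT, plan TEXT);
INSERT INTO src_members VALUES ('M1','A'), ('M2','B'), ('M3','A');
INSERT INTO tgt_members VALUES ('M1','A'), ('M2','B'), ('M3','A');
""")

# Coarse check: row counts must match between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_members").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_members").fetchone()[0]

# Fine-grained check: rows present in source but missing from target.
# (Swap the operands of EXCEPT to find extras in the target.)
missing_in_tgt = conn.execute("""
    SELECT member_id, plan FROM src_members
    EXCEPT
    SELECT member_id, plan FROM tgt_members
""").fetchall()
```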

Environment: IBM DataStage 8.0, SSIS, flat files, MS SQL Server 2008, Oracle 11g, SQL, PL/SQL, IBM DB2, JIRA, Agile, Teradata 13, Teradata SQL Assistant, Cognos 8.0 Series, HP QTP 9.0, HP Quality Center 10, AutoSys, TOAD, UNIX shell scripting

Dean Health Plan, Madison, WI May 2010 – June 2011


ETL QA Tester
Responsibilities:

• Obtained detailed knowledge of the business processes followed in the project environment.
• Elicited business requirements from end users, keeping their application needs in mind, and documented them for the developers.
• Extensively used JIRA for defect management and project management.
• Involved in extensive data validation using SQL queries and backend testing.
• Contributed to the initial data mining work and the development of tools and technology.
• Performed gap analysis of processes to identify and validate requirements.
• Organized requirements into high-level use cases and low-level use case specifications, and modeled them as use case, activity, and sequence diagrams using Rational Rose and MS Visio.
• Initiated detailed discussions and functional walkthroughs with stakeholders.
• Produced system documents, functional requirements, and ETL mapping and modeling deliverables.
• Worked on root causes of data inaccuracies and improved data quality using DataFlux.
• Used QTP for functionality testing, regression testing, and verification and validation testing.
• Involved in hybrid frameworks and weekly automation regression testing using QTP.
• Resolved conflicts between business and system teams.
• Wrote complex SQL and PL/SQL test scripts for backend testing of the data warehouse application.
• Assisted developers and testing teams to ensure that requirements were communicated accurately and that the system being developed was verifiable against the requirements.
• Involved in documenting and executing business test cases.
• Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity.
• Aided in the creation of a simple user manual for new users.
• Responsible for resolving issues assigned from equities and fixed income.
• Implemented updates for the metadata system using Excel and VB macros.
• Gathered detailed information and performed analysis on data elements to merge 4 legacy LOBs into the data warehouse.
• Supported the quality assurance effort with data analysis and execution of SQL queries against DB2 tables in a UNIX environment for data validation and verification across multiple environments, working closely with the QA team to confirm validation.
• Involved in backend testing by writing complex SQL queries.
• Understood data models, data schemas, and ETL, and created extensive stored procedures and SQL queries for backend data warehouse testing.

Environment: Agile, JIRA, Quality Center 9.2, Cognos 8.3, MS Word, Rational Requisite Pro, Oracle 10g, Java/J2EE, QTP

Merck, Philadelphia, PA Jan 2009 – March 2010


ETL QA Tester
Responsibilities:

• Extensively used T-SQL to verify and validate the data loaded into SQL Server 2005.
• Worked with T-SQL to validate SQL Server 2000 data.
• Assisted in implementing fact and dimension tables in a star schema model based on requirements.
• Worked with the leadership team to analyze the current SDLC process and recommended process improvements.
• Mapped metadata from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
• Involved in Teradata SQL development, unit testing, and performance tuning.
• Used TOAD for querying Oracle, and Teradata SQL Assistant for querying Teradata.
• Performed extraction, transformation, and loading using different components and expressions in Ab Initio to create test data.
• Worked in Quality Center to track and report defects assigned to developers; designed and developed test plans, test scripts, and test cases.
• Solid testing experience working with SQL stored procedures, triggers, and views, and with performance tuning of complex SQL queries.
• Worked on data migration issues from development to testing.
• Tested reports created in Business Objects by running SQL statements against the tables in the Oracle database.
• Involved in user training sessions and assisted with UAT (User Acceptance Testing).
Environment: Ab Initio (GDE 1.14, Co>Op 2.14), Teradata V2R5, J2EE 1.4, HP Quality Center 9.2, SSIS, SSRS, XML, SQL, Oracle 9i/10g, IBM AIX 5.1, Business Objects XIR2/6.5, UNIX, Korn shell scripting, Windows, Influx, and Test Director 8

Mayo Clinic, Rochester, MN Nov 2007 – Dec 2008


SQL Tester
Responsibilities:
• Designed, developed, and tested UNIX shell scripts as part of the ETL process to automate loading and pulling data for testing ETL loads.
• Wrote several complex PL/SQL statements for various business scenarios.
• Loaded data from the operational data store (ODS) into data warehouse tables by writing and executing foreign key validation programs to verify the star schema relationships between fact tables and dimension/lookup tables.
• Wrote triggers enforcing integrity constraints, stored procedures for complex mappings, and cursors for data extraction.
• Worked extensively with mappings using expressions, aggregators, filters, lookups, and procedures to develop and feed the data mart.
• Performed data parsing, text processing, and database connectivity using Perl.
• Developed UNIX shell scripts to automate repetitive database processes.
• Tested several ETL routines and procedures.
• Identified the primary key (logical/physical) and applied update or insert logic.
• Deleted target data before processing based on the logical or physical primary key.
• Designed and executed test cases on the application per company standards.
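The foreign key validation between fact and dimension tables mentioned above is essentially an orphan check: every fact row's key must resolve to a dimension row. A minimal sketch with hypothetical star-schema tables, using SQLite for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_provider (provider_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_visits  (visit_id INTEGER, provider_key INTEGER, charge REAL);
INSERT INTO dim_provider VALUES (1, 'A'), (2, 'B');
INSERT INTO fact_visits  VALUES (10, 1, 99.0), (11, 2, 45.0), (12, 1, 12.5);
""")

# Orphan check: a LEFT JOIN from fact to dimension exposes fact rows whose
# foreign key finds no match (the dimension side comes back NULL).
orphans = conn.execute("""
    SELECT f.visit_id
    FROM fact_visits f
    LEFT JOIN dim_provider d ON f.provider_key = d.provider_key
    WHERE d.provider_key IS NULL
""").fetchall()

# An empty result means referential integrity holds for this fact table.
```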

Environment: Oracle 8i, SQL*Plus, SQL, Test Director, SQL Server 2005, T-SQL, PL/SQL, Visual Basic 6.0, Windows 95, XML, XSLT, XSD, UNIX, Korn shell scripting
