ETL TESTING +91-8861629245
PROFESSIONAL SUMMARY:
Overall 3 years of experience in the IT industry, working on ETL Testing, DWH Testing, and
Snowflake database testing.
IT industry experience in the mortgage, retail, and automotive domains with strong business knowledge.
Experience in creating databases, tables, and views in MySQL and Snowflake.
Experience working with Agile methodologies, responsible for Sprint and Agile Scrum status
meeting updates, with expertise in defect tracking, defect management, and bug reporting
tools such as HP ALM and JIRA.
Understanding of project requirements (base tables, driver tables, package tables, business rules)
and key outcomes.
Experience in testing Data Warehouse/ETL applications developed in Informatica 9.6.1
using Oracle.
Extensive experience in developing SQL scripts to validate database tables and report data
for backend database testing in Oracle/SQL Server.
Good experience with UNIX commands.
Well acquainted with QA methodology, the Software Development Life Cycle, and the Software Test
Life Cycle, including requirement management, risk management, test strategy preparation,
test plan development, test scenario design, test case creation, test execution, test
automation, bug tracking, result analysis, and preparing status reports in each
Sprint/Iteration.
Extensively interacted with developer, business, and management teams to understand the
project business requirements and ETL design document specifications.
Experience in analyzing errors that caused ETL load failures with reference to the log
file, and reporting them to the corresponding team for rectification.
Experience with Workflow Manager and Workflow Monitor for Informatica ETL jobs.
TECHNICAL SKILLS:
Testing : ETL/DWH Testing
Databases : Oracle 10g, Snowflake, MySQL
Query Language : SQL
Test Management Tools : QC (ALM), JIRA
ETL Tools : Informatica PowerCenter
Operating Systems : Windows, UNIX
EDUCATION:
Diploma in Electrical and Electronics Engineering (EEE) from Government Polytechnic
Belagavi in 2017.
WORK EXPERIENCE:
Currently deputed as a Software Test Engineer at Thermo Fisher
Scientific, Bangalore, from June 2021 to date.
Deputed as an Associate Test Engineer at Neiman Marcus Group,
Bangalore, from August 2019 to May 2021.
PROJECT DETAILS:
Project # 1
Project : ThermoFisher Scientific
Client : ThermoFisher Scientific
Database : Oracle 10g
Role : Test Engineer (ETL)
Tools : MySQL, Informatica, Unix Server, Jira, Confluence.
Description :
Data arrives from different sources in various formats; it needs to be transformed into
a consistent format and then updated on a regular basis so that the client can accurately forecast its
liabilities.
For many of the scientific and medical instruments delivered to or purchased by customers, this
is a relatively straightforward task, but for products which are either more complex or
have not been updated for several years, significantly more work is required to update the data and
investigate any changes in the resulting reserves.
RESPONSIBILITIES:
Involved in understanding the business requirements.
Involved in running the jobs for the ETL process.
Designed, reviewed, and executed test cases.
Prepared and ran SQL queries for data validation.
Verified the data in the target database after the ETL process.
Prepared test data for testing.
Interacted with BA and Dev teams to resolve issues.
Analyzed and reported defects in the JIRA tool.
Participated in daily stand-up meetings.
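The SQL data-validation work listed above usually reconciles source and target tables after an ETL run. A minimal sketch of two common checks (row-count reconciliation and a MINUS/EXCEPT comparison), using Python with an in-memory SQLite database as a stand-in for the project's actual Oracle/MySQL databases; all table and column names here are hypothetical, not taken from the project:

```python
import sqlite3

# In-memory database standing in for the real source and target schemas.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Illustrative source and target tables (names are hypothetical).
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.5), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.5), (3, 30.0)])

def count_check(cur, src, tgt):
    """Row-count reconciliation between source and target tables."""
    src_count = cur.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    return src_count == tgt_count

def minus_check(cur, src, tgt):
    """Rows present in the source but missing from the target (MINUS/EXCEPT)."""
    return cur.execute(
        f"SELECT * FROM {src} EXCEPT SELECT * FROM {tgt}").fetchall()

print(count_check(cur, "src_orders", "tgt_orders"))  # True
print(minus_check(cur, "src_orders", "tgt_orders"))  # []
```

Against the real databases the same queries would run through each database's own client; Oracle spells the set operator MINUS rather than EXCEPT.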
Project # 2
Project : Neiman Marcus Group
Client : Neiman Marcus
Database : Oracle 10g
Role : ETL Test Engineer
Tools : Snowflake, MySQL, AWS S3, Jenkins, UNIX server, Jira, Bit bucket, Confluence.
Description : NMG is decommissioning its MySQL servers, so data is being migrated incrementally from
MySQL to the Snowflake database through scheduled Jenkins jobs. Earlier, MySQL used SQL stored
procedures to load data; now NMG has implemented scheduled jobs through Jenkins to load the data into
Snowflake. The jobs extract data from the source tables, transform it as per the business logic, load it
into Snowflake tables, and finally save it as ORC files in S3.
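Incremental migrations like the one described are often validated by checking that every source row past the last load watermark has reached the target. A hedged sketch of that check, again using Python with SQLite in place of the actual MySQL and Snowflake databases; the table names, columns, and watermark value are illustrative assumptions, not details of the NMG project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source (MySQL) and target (Snowflake) tables with a load date.
cur.execute("CREATE TABLE mysql_sales (id INTEGER, qty INTEGER, load_dt TEXT)")
cur.execute("CREATE TABLE snow_sales (id INTEGER, qty INTEGER, load_dt TEXT)")
cur.executemany("INSERT INTO mysql_sales VALUES (?, ?, ?)",
                [(1, 5, "2021-01-01"), (2, 7, "2021-01-02"),
                 (3, 9, "2021-01-03")])
# The target has received only the first two incremental batches.
cur.executemany("INSERT INTO snow_sales VALUES (?, ?, ?)",
                [(1, 5, "2021-01-01"), (2, 7, "2021-01-02")])

def incremental_diff(cur, watermark):
    """Source rows newer than the watermark that have not reached the target."""
    return cur.execute(
        "SELECT id FROM mysql_sales WHERE load_dt > ? "
        "EXCEPT SELECT id FROM snow_sales WHERE load_dt > ?",
        (watermark, watermark)).fetchall()

print(incremental_diff(cur, "2021-01-01"))  # [(3,)]
```

In practice the source and target sides of this comparison run through separate database clients, since MySQL and Snowflake live on different servers; the per-batch diff logic stays the same.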
RESPONSIBILITIES:
DECLARATION:
I hereby declare that all information written above is true to the best of my knowledge.