
Sunil +91 9691792494

Professional Summary

 Accomplished professional with nearly 5 years of experience in development and analysis (Snowflake cloud database, AWS, SQL, Python, Teradata, UNIX, Informatica PowerCenter, and data warehousing).
 Currently working as a Data Engineer in Bangalore.
 Experience in design and support, with strong knowledge of data warehousing concepts, business intelligence, and reporting services.
 Expertise in end-to-end implementation of projects, including design, development, coding, and testing.
 Built a data platform on Snowflake, using a Python framework as the ETL tool to create an enterprise reporting data warehouse.
 Migrated terabytes of data from Teradata/Redshift to Snowflake.
 Experience writing complex SQL & SnowSQL.
 Competent in end-to-end project management, allocating resources effectively to meet project specifications.
 Excellent interpersonal and communication skills; experienced in working with business stakeholders and developers across multiple disciplines.

Key Technical and Domain skills

 4.6 years of total experience in the IT industry.


 2.5 years of experience with Snowflake, Python, and AWS as a cloud data engineer.
 4.6 years of experience in ETL development.

IT SKILLS

 ETL tools: Informatica PowerCenter, Informatica Cloud, APIs, data warehousing

 Cloud: AWS (S3, EC2, IAM, Lambda), Snowflake
 Databases: Snowflake (cloud), Teradata, SQL, Hive
 Programming language: Python
 Tools: PuTTY, PyCharm, Git, Jira, Bitbucket
 Schedulers: AutoSys, JAMS, Control-M
 Reporting tool: MicroStrategy
Work History:

Project 1
Data Engineer July 2019 to Present

 Meredith Corporation (formerly Time Inc.) is one of the world’s leading media companies, with a monthly global print audience of over 120 million and worldwide digital properties that attract more than 120 million unique visitors each month.

 The scope of this project is to gather data related to the advertising system and load it into the Snowflake data warehouse hosted on the Amazon cloud. The data from the warehouse is then used by the reporting team to generate business reports.

 Coordinated with the business team to understand business requirements and prepare technical design documents; extensively involved in requirement gathering, design, analysis, and development.

 Involved in work planning, tracking, and reporting at each phase of the project.

 Expertise in Snowflake data modelling and ELT using Snowflake SQL, implementing stored procedures and standard DWH ETL processes.

 Worked on multiple projects, using Python, AWS, and Snowflake to load data into the data warehouse.
 Strong development experience building data applications with Python.
 Developed an end-to-end ETL process to bring data from SFTP and load it into Snowflake via the Python API.
 Used AWS S3, EC2, and Lambda services to bring data from the SFTP server onto our servers.

 Have good knowledge of AWS EC2, IAM, and S3.
 Prepared the development, UAT, and production environment migration checklist.

 Built the Python scripts required for the project.

 Created an end-to-end ETL pipeline from multiple sources to load data into Snowflake.

 Provided maintenance, support, and defect fixes after QA and production deployments.

 Attended daily team meetings as well as design discussions per business requirements.
 Conceptualized, designed, developed, and productized a new ETL pipeline using big data, Python, SFTP, S3, EC2, and APIs.
 Created external and standard tables and views in the Snowflake database.

 Good knowledge of the MicroStrategy reporting tool.

Software Developer, IBM Jan 2017 to June 2019

IBM Global Services US has entered into an agreement with DIRECTV for the outsourcing of software application development, installation, and maintenance.

 As an integral part of the solution proposed by IBM Global Services to DIRECTV, applications are maintained and supported on an ongoing basis by IBM Global Services.

 DIRECTV requires information technology services presently performed and managed by or for DIRECTV, such as Business Intelligence (“BI”) software development, testing, and project management services, BI maintenance services, BI production support, and other additional information technology services.

 Performed data reconciliation across various source systems and Teradata.

 With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks arising from query writing, skewed redistributions, etc.

 Understood and analysed the requirement documents and resolved queries.

 Migrated data from Teradata to Hive in the Hadoop ecosystem using HDFS and Sqoop.

 Created BTEQ, FastLoad, MultiLoad, and FastExport scripts according to client requirements; experienced in performance tuning.
 Created and updated a number of mappings, reusable sessions, and workflows.

 Migrated data from Teradata to the Snowflake data warehouse.

 Involved in unit testing and performance tuning

 Created workflows and sessions, including reusable sessions.

 Developed Informatica mappings and workflows to load data from various sources using transformations such as Source Qualifier, Router, Filter, Sequence Generator, and Expression.


ACADEMIC DETAILS
2016 B.E. (Computer Science and Engineering) with 74.2%

SIGNIFICANT HIGHLIGHTS:

• Received the Manager’s Choice Award in the 2019 and 2020 programs for creating a new Python framework that saves the company $5K yearly in Lambda function costs.
• Received the Manager’s Choice Award in the 2017 and 2018 programs for the “Put the Client First” practice.
• Teradata Certified Professional (Version: TD14 –Basic TE0-140)
