
SAMBAIAH MITTA

Mobile: +91-9000998613
Email: mittasamba959@gmail.com

PROFESSIONAL SUMMARY

• Having 7+ years of IT experience, with 3+ years of extensive experience in the Snowflake cloud data platform and Oracle databases.
• Strong command of SQL queries and database objects.
• Implemented solutions using Snowflake’s data sharing, zero-copy cloning, and Time Travel (a sketch follows this list).
• Implemented role-based access control (RBAC) in Snowflake (illustrated in the same sketch).
• Proficient in implementing data lake solutions to build an enterprise data hub.
• Well versed in data warehouse architecture and data modelling concepts; experienced in designing dimensional models (star and snowflake schemas) and implementing Slowly Changing Dimensions (an SCD Type 2 sketch follows this list).
• Strong expertise in working with the Snowflake cloud database; implemented end-to-end Snowflake architecture.
• Good experience in creating external stages, Snowpipes, streams, tasks, file formats, and procedures, and in setting up Snowflake integration with AWS (a Snowpipe sketch follows this list).
• Expert, hands-on experience integrating very complex XML files into Snowflake.
• Expert, hands-on experience with different file formats (CSV, XLSX, JSON, Parquet), loading data from different files and databases, and applying transformations over the data.
• Implemented data ingestion strategies in Snowflake using Snowpipe and external tables.
• Good experience architecting data pipeline solutions using AWS and Snowflake.
• Implemented orchestration and transformation jobs using the Azure Data Factory ETL tool.
• Good understanding of Agile/Scrum methodologies.
• Good at understanding business logic; able to work well both as part of a team and individually with minimal supervision.
• Ability to learn new languages and tools quickly.
• Excellent written and oral communication skills.
• Experience connecting to different sources such as on-premises SQL Server, virtual machines, Azure SQL Database, Data Lake Storage, and Blob Storage from ADF.
• Proven proficiency with data transformations such as Lookup, Derived Column, Conditional Split, Sort, Data Conversion, Union All, and Merge Join.
• Experience in creating Azure Blob, Data Lake Storage, and SQL DB resources.
• Experienced with Logic Apps for configuring the web activity on pipelines and trigger status.
• Creating triggers and pipelines as per client requirements.
• Scheduling reports on a weekly and monthly basis as per client requirements.
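The bullets above mention cloning, Time Travel, and role-based access control; the following is a minimal Snowflake SQL sketch of those features. The database, table, role, and user names are illustrative placeholders, not objects from any client project.

    -- Zero-copy clone of a production database for a test environment
    CREATE DATABASE dev_db CLONE prod_db;

    -- Time Travel: query a table as it existed one hour ago
    SELECT *
    FROM prod_db.sales.orders AT (OFFSET => -3600);

    -- Time Travel also allows recovering a dropped table
    UNDROP TABLE prod_db.sales.orders;

    -- Role-based access control: create a role, grant privileges, assign it
    CREATE ROLE analyst_role;
    GRANT USAGE ON DATABASE prod_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA prod_db.sales TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA prod_db.sales TO ROLE analyst_role;
    GRANT ROLE analyst_role TO USER report_user;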
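A sketch of the external stage / file format / Snowpipe setup and XML parsing referenced above; the bucket, storage integration, and table names are assumptions for illustration only.

    -- File format for pipe-delimited CSV files
    CREATE OR REPLACE FILE FORMAT my_csv_format
      TYPE = 'CSV' FIELD_DELIMITER = '|' SKIP_HEADER = 1;

    -- External stage over an S3 bucket (storage integration created separately)
    CREATE OR REPLACE STAGE my_s3_stage
      URL = 's3://my-bucket/landing/'
      STORAGE_INTEGRATION = my_s3_integration
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

    -- Snowpipe: auto-ingest files as they arrive in the bucket
    CREATE OR REPLACE PIPE customer_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.customer FROM @my_s3_stage;

    -- External table for querying staged files in place
    CREATE OR REPLACE EXTERNAL TABLE raw.customer_ext
      LOCATION = @my_s3_stage
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format')
      AUTO_REFRESH = TRUE;

    -- Extracting a value from an XML document held in a VARCHAR column
    SELECT XMLGET(PARSE_XML(src.xml_doc), 'customerName'):"$"::STRING AS customer_name
    FROM raw.xml_landing src;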
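For the Slowly Changing Dimension work mentioned above, a minimal SCD Type 2 pattern in Snowflake SQL; dim_customer, stg_customer, and the tracked columns are hypothetical examples, not actual project tables.

    -- Step 1: expire the current row when tracked attributes change
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.city <> s.city OR d.segment <> s.segment) THEN
      UPDATE SET is_current = FALSE, end_date = CURRENT_DATE;

    -- Step 2: insert a fresh current row for changed and brand-new customers
    INSERT INTO dim_customer (customer_id, city, segment, start_date, end_date, is_current)
    SELECT s.customer_id, s.city, s.segment, CURRENT_DATE, NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL;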

EXPERIENCE SUMMARY

Joined Cognizant as a Process Associate (fresher) in 2016 and moved into a Snowflake Developer & ADF role in 2021.
Working as a Senior Associate Consultant at Infosys from 2022 till date.

EDUCATION DETAILS:

Completed B.Com (Computers) from MasterJi Degree College, Warangal (K.U.), in 2012 with 65%.
Completed MBA (Finance) from Sphoorthy Engineering College, Hyderabad (J.N.T.U.), in 2015 with 70%.
TECHNICAL SKILLS:
Database Tools: Snowflake, Oracle, Azure Data Factory
ETL Tools: Azure Data Factory
Databases: Oracle 9i, Oracle 10g, 11g
Programming: SQL and PySpark
Operating Systems: Windows 2000/XP/7, UNIX

Project Experience:

Project: 1

Project Name: DI SYSTEM
Client: Denny’s Corporation

Project Description:

Denny’s is a table-service, diner-style restaurant chain operating over 1,600 restaurants in the
United States, Canada, and the United Kingdom. Denny’s is known for always being open,
serving breakfast, lunch, and dinner around the clock; it does not close on holidays or at night.
This project deals with building a data lake for Denny’s warehouse system. It involves
migrating on-premises systems data such as Oracle, MySQL, and Postgres to an AWS data
lake (S3) and then ingesting the data into Snowflake.

Roles and Responsibilities

• Analyzed requirements, developed solutions, and designed the approach for the integrations.
• Involved in designing the Snowflake data model and the cloud data warehouse.
• Responsible for integrating the log files to AWS S3 and then into Snowflake (see the COPY sketch after this list).
• Designed the audit log framework and explained the requirements to the team.
• Automated the existing process and designed generic functions reusable across different projects/applications.
• Prepared Source-to-Target and business process mappings.
• Used Agile methodologies.
• Worked on solutioning, planning and execution of efforts, estimation, and timelines of the work.
• Assisted the developer team with complex query tuning and schema refinement; provided 24x7 support for critical production systems.
• Implemented Snowpipe for real-time data ingestion.
• Interacted with the client to gather requirements.
• Developed pipelines, linked services, and datasets in Azure Data Factory version 2.
• Created various ADF pipelines to achieve the business scenarios.
• Configured Azure cloud services including Azure Blob and Azure SQL DB.
• Scheduled pipelines with tumbling-window triggers to automate jobs in ADF.
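A sketch of the S3-to-Snowflake log ingestion described in this list; the stage, file format, and table names are placeholders rather than the actual project objects.

    -- JSON file format for the raw log files
    CREATE OR REPLACE FILE FORMAT log_json_format TYPE = 'JSON';

    -- Load new log files from the S3 landing stage into a raw VARIANT column
    COPY INTO raw.app_logs (log_record)
    FROM @logs_s3_stage/app/
    FILE_FORMAT = (FORMAT_NAME = 'log_json_format')
    PATTERN = '.*\\.json'
    ON_ERROR = 'SKIP_FILE';  -- skip bad files instead of failing the whole load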
Environment: Oracle 12c SQL and Snowflake cloud database.

Project: 2


Project Details:
Client: BT Group plc.
Duration: May 2022 till date.
Position: Team Member.
Environment: Snowflake

Project Description:
BT is one of the world's leading communications services companies,
serving the needs of customers in the UK and in more than 170 countries
worldwide. The group's main activities are the provision of fixed-line services,
broadband, mobile and TV products and services as well as networked IT
services. In the UK, BT is a leading communications services provider, selling
products and services to consumers, small and medium sized enterprises and the
public sector. BT also sells wholesale products and services to communications
providers in the UK and around the world. Globally, the company supplies managed
networked IT services to multinational corporations, domestic businesses and
national and local government organisations. For more information, visit
www.btplc.com

Roles and Responsibilities


• Created technical specifications for the coding approach based on requirements.
• Invoked SQL*Loader to load data into staging tables after receiving flat files from the CRM system on a daily basis.
• Created database triggers for monitoring daily data transactions.
• Responsible for all activities related to the development, implementation, administration, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
• Bulk loading from the external stage (AWS S3) into Snowflake using the COPY command.
• Loading data into Snowflake tables from the internal stage and from the local machine.
• Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
• Involved in performance tuning of sessions, mappings, and data models supporting the data warehouse; involved in building the ETL architecture and Source-to-Target mappings to load data into the data warehouse.
• Created task flows to automate data loading/unloading to/from the Snowflake data warehouse and AWS S3 (a sketch follows this list).
• Created triggers and pipelines as per client requirements.
• Scheduled reports on a weekly and monthly basis as per client requirements.
• Experience implementing data warehouse concepts such as star schemas, dimensions, and fact tables.
• Experience using the MySQL database.
• Experience with Azure Key Vault.
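A sketch of the task flow mentioned above for automating loads and unloads between AWS S3 and Snowflake; the warehouse, stage, and table names are illustrative assumptions.

    -- Root task: load new order files on a daily schedule
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = 'USING CRON 0 2 * * * UTC'  -- daily at 02:00 UTC
    AS
      COPY INTO staging.orders
      FROM @orders_s3_stage
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

    -- Child task: unload the day's extract back to S3 after the load finishes
    CREATE OR REPLACE TASK unload_orders_task
      WAREHOUSE = etl_wh
      AFTER load_orders_task
    AS
      COPY INTO @orders_s3_stage/extracts/
      FROM (SELECT * FROM staging.orders WHERE order_date = CURRENT_DATE)
      OVERWRITE = TRUE;

    -- Tasks are created suspended; resume the child first, then the root
    ALTER TASK unload_orders_task RESUME;
    ALTER TASK load_orders_task RESUME;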
(SAMBAIAH MITTA)