
1351 N Alma School Rd, #150
Chandler, AZ 85224
Website: www.apisero.com

Arvind Chaudhary

SUMMARY

● Certified Snowflake Developer with more than 7 years of ETL/SQL development experience, seeking challenging assignments for a career encompassing personal and professional enhancement.
● Experience working with modern replication tools such as Fivetran and HVR.
● Experience working with the DBT CLI transformation tool.
● Experience with SQL Server, Informatica, and Tableau, with a specialized focus on database/ETL design and Agile methodologies.
● Worked in various domains such as Financial, Insurance, and Legal. Extensive working knowledge of Snowflake, MS SQL Server, Python, and Informatica.
● Involved in various projects related to data modeling, data analysis, design, and development for both OLTP and OLAP data warehousing environments.
● Experience in SQL, PL/SQL, and T-SQL programming.
● Involved in migrating database objects from SQL Server, Salesforce, Zendesk, and Jira to the cloud (Snowflake & AWS).
● Expertise in Snowflake: data modeling, ELT using SnowSQL, implementing complex stored procedures, Tasks, and Streams, and standard DWH and ETL concepts (a Streams-and-Tasks sketch follows this list).
● Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, and Tasks.
● Strong experience in creating database objects such as tables, views, functions, stored procedures, indexes, triggers, and cursors in SQL Server.
● Expert in developing stored procedures, views, UDFs, and triggers, and in performance monitoring.
● Strong analytical and statistical skills for design of experiments, data analysis, and reporting.
● Experienced in gathering requirements, designing, planning/scheduling, and developing.
● Achieved goals in the areas of product performance and reliability, customer satisfaction, and production quality.
● Good experience in developing and maintaining Tableau data sources, data extracts, data security, and server-side activities.
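
A minimal sketch of the Streams-and-Tasks pattern referenced above; the table, warehouse, and column names are hypothetical, and the actual pipelines are project-specific:

    -- Capture changes on a staging table (illustrative names throughout).
    CREATE OR REPLACE STREAM orders_stream ON TABLE staging.orders;

    -- A scheduled task that merges pending changes into the target,
    -- firing only when the stream actually has data.
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      MERGE INTO dwh.orders AS t
      USING orders_stream AS s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
        VALUES (s.order_id, s.amount, s.updated_at);

    ALTER TASK merge_orders_task RESUME;  -- tasks are created suspended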

CERTIFICATION

● SnowPro Certified Developer

TECHNICAL SKILLS

OS: Windows, Linux, and Unix
Scripting Languages: Python and JavaScript
Databases: Snowflake, Microsoft SQL Server 2008 & 2014, and Oracle
BI & ETL: DBT CLI, Informatica PowerCenter, Fivetran, and HVR (Replicator)
Reporting Tools: Tableau and Power BI
Configuration Tools: Jira, GitHub, Bitbucket, Bamboo, and Azure DevOps

WORK EXPERIENCE

Client: New American Funding

Role: Senior Lead Engineer
Description: Migration of an on-premises SQL Server database to Snowflake hosted on the Azure cloud. After the one-time migration, enabling CDC to generate live Tableau business reports. Converting the SSIS SQL packages to Snowflake stored procedures and views.
Responsibilities:
● Gathered volumetrics such as the total number of tables, space occupied by individual tables, index size, total occupied size, number of partition groups, etc.
● Installed an agent on the on-prem SQL Server to encrypt data while reading from SQL Server, and installed the hub on a client VM, where decryption occurs and which acts as a staging area for Snowflake.
● Migrated the tables using the HVR tool and enabled CDC using HVR.
● Converted the SQL Server stored procedures and views to work with Snowflake (a minimal sketch follows this list).
● Loaded semi-structured data into Snowflake.
● Ongoing activity to connect the client's different source systems to Snowflake.
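
A minimal sketch of the conversion target, i.e. a Snowflake Scripting stored procedure standing in for a converted SSIS/T-SQL routine; the schema, table, and column names are hypothetical:

    -- Illustrative only: the real packages contain client-specific logic.
    CREATE OR REPLACE PROCEDURE refresh_loan_summary()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
      -- Rebuild a reporting table from the staging layer.
      CREATE OR REPLACE TABLE rpt.loan_summary AS
        SELECT loan_id,
               SUM(payment_amount) AS total_paid,
               MAX(payment_date)   AS last_payment
        FROM   stg.loan_payments
        GROUP  BY loan_id;
      RETURN 'loan_summary refreshed';
    END;
    $$;

    CALL refresh_loan_summary();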
Environment/Applications Used: Snowflake, Azure Blob Storage, HVR hub, Azure DevOps, Git, and Tableau.

Client: Abacus Next


Role: Senior Lead Engineer
Description: Migration of an on-premises SQL Server database and of Zendesk, Salesforce, and JIRA ticketing data to Snowflake hosted on the AWS cloud. After bringing data to the staging layer, transforming it using the DBT CLI tool and building the DWH layer for ad-hoc reporting.
Responsibilities:
● Gathered volumetrics such as the total number of tables, space occupied by individual tables, index size, total occupied size, number of partition groups, etc.
● Migrated the tables using the Fivetran tool and enabled CDC using Fivetran.
● Migrated Zendesk, Salesforce, and Jira API data using the Fivetran replication tool.
● Transformed data using the DBT CLI tool.
● Implemented SCD2 using the snapshot functionality in DBT (see the sketch after this list).
● Loaded semi-structured data into Snowflake.
● Ongoing activity to connect MongoDB with Fivetran and migrate data to the Snowflake cloud.
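
A minimal sketch of the DBT snapshot approach to SCD2; the source and key names are hypothetical:

    -- snapshots/accounts_snapshot.sql (illustrative)
    {% snapshot accounts_snapshot %}
    {{
        config(
          target_schema='snapshots',
          unique_key='account_id',
          strategy='timestamp',
          updated_at='updated_at'
        )
    }}
    select * from {{ source('salesforce', 'accounts') }}
    {% endsnapshot %}

Running dbt snapshot then maintains dbt_valid_from/dbt_valid_to columns on the snapshot table, which is the Type 2 history described above.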

Environment/Applications Used: Snowflake, Amazon S3, Fivetran, DBT CLI transformations, MS SQL, Salesforce, Jira, Zendesk, Azure DevOps, Git, Snowflake Tasks for orchestration, and Tableau.

Client: Comcast - ESG Rating


Role: Lead Engineer
Description: ESG Rating is a platform created for analysts to assign ESG ratings to fixed income (FI) and equity (EQ) companies. Before providing the ratings, analysts need to submit answers to various Environment, Social, and Governance questions, which justify the rating being given. The mapping of FI companies to EQ companies is onboarded from FIL internal sources.
Responsibilities:
● Gathered pricing, shares outstanding, and MSCI data for different data points related to the E, S, and G pillars from sources such as Bloomberg, FactSet, MDM, and MSCI.
● Loaded the data to the DB with the help of integration tools: Informatica PowerCenter and Python loaders.
● Wrote complex SQL queries against the SQL Server database.
● Created/modified database objects such as tables, views, cursors, procedures, functions, triggers, etc.
● Wrote SQL Loader scripts to read data from external files for integration with applications.
● Performed performance tuning of SQL queries and optimized the index-rebuilding jobs.
● Created, gathered, and analyzed data dashboards from different sources.
● Generated advanced Tableau dashboards with quick/context/global filters, parameters, and calculated fields, which helped track and improve customer-unit KPIs by 12% within a month.
● Worked with end users, business owners, the operations team, technical staff, and project team members to plan, design, develop, implement, and enhance business analytics capabilities using Tableau's business intelligence tools, covering requirements gathering and the design and creation of Tableau dashboards/SSRS reports.
● Gathered requirements from business users; designed and implemented ETL processes; managed and maintained databases; created tables and stored procedures; and performed database development and data analysis on existing stored procedures, triggers, functions, and cursors.
● Wrote stored procedures and user-defined scalar functions (UDFs) to be used in the SSIS packages and SQL scripts, applying T-SQL programming techniques for query optimization and performance tuning (a minimal UDF sketch follows this list).
● Reviewed current operational data structures and data flows and recommended optimizations and opportunities for automation.
● Participated in client engagement meetings to develop plans and strategies for clients' data management processes and IT programs, providing hands-on assistance in data modeling, technical implementation, and best practices.
● Debugged, monitored, and troubleshot BI solutions.
● Converted the SQL Server stored procedures and views to work with Snowflake.
● As part of the L3 group, available for on-call support on a rotational basis to troubleshoot production issues in a timely manner.
● Extracted data from sources and transformed it using transformations such as Data Conversion, Derived Column, Lookup, Conditional Split, Aggregate, Union All, Row Count, and Merge Join.
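
A minimal sketch of the kind of scalar UDF reused from SSIS packages and SQL scripts; the function name and the July fiscal-year start are purely illustrative:

    -- Illustrative T-SQL scalar UDF; the real functions are project-specific.
    CREATE FUNCTION dbo.ufn_FiscalQuarter (@d DATE)
    RETURNS CHAR(2)
    AS
    BEGIN
        -- Map a calendar date to a fiscal quarter label,
        -- assuming (for illustration) a fiscal year starting in July.
        RETURN 'Q' + CAST(((MONTH(@d) + 5) % 12) / 3 + 1 AS CHAR(1));
    END;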
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

Client: Credit Rating

Role: Lead Engineer
Description: Credit Rating is a platform created for analysts to submit different types of ratings on FI companies; the platform supports ABC, Outlook, Fundamental, and RV ratings. Using this platform, analysts can update the ratings on companies according to their research. The ratings given are sent to different consumers with the help of overnight loaders. The mapping of FI issues to DEBT tickers is created internally. A rating-expiry mail is triggered to the portfolio managers to whom the analysts report, based on expiry-threshold logic for alpha and beta ratings.
Responsibilities:
● Gathered FI metadata such as coupon rate, FI bond exposure, tier code, governing law, fallback category, etc., from FIL internal sources.
● Loaded the data to the DB with the help of integration tools: Informatica PowerCenter and Python loaders.
● Wrote complex SQL queries against the SQL Server database.
● Created/modified database objects such as tables, views, cursors, procedures, functions, triggers, etc.
● Wrote SQL Loader scripts to read data from external files for integration with applications.
● Performed performance tuning of SQL queries and optimized the index-rebuilding jobs.
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

Client: Analyst Model Portfolio


Role: Senior Software Engineer
Description: AMP is a platform created to evaluate analysts' year-end performance appraisal cycle based on fund performance. In this platform, analysts from different locations around the world create paper funds and map those funds to custom or standard benchmarks to measure the funds' performance. Corporate actions are applied just as in the real world, and trades are simulated. Various financial reports are fetched from the database with the help of SQL procedures, and returns are calculated on a daily, monthly, quarterly, and yearly basis.
Responsibilities:
● Gathered pricing, exchange rate, and corporate-action data from sources such as Bloomberg, FactSet, etc.
● Loaded the data to the DB with the help of integration tools: Informatica PowerCenter and Python loaders.
● Wrote complex SQL queries against the SQL Server database.
● Wrote a stored procedure to rebalance the custom benchmarks daily based on market data (a minimal sketch follows this list).
● Created/modified database objects such as tables, views, cursors, procedures, functions, triggers, etc.
● Wrote SQL Loader scripts to read data from external files for integration with applications.
● Performed performance tuning of SQL queries and optimized the index-rebuilding jobs.
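
A minimal sketch of the daily rebalancing procedure mentioned above; the benchmark and market-data tables are hypothetical stand-ins for the internal schema:

    -- Illustrative T-SQL; the real rebalancing rules are proprietary.
    CREATE PROCEDURE dbo.usp_RebalanceCustomBenchmark
        @BenchmarkId INT,
        @AsOfDate    DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Recompute constituent weights from the latest market values
        -- so that weights sum to 1 as of @AsOfDate.
        WITH mv AS (
            SELECT c.SecurityId,
                   c.Units * p.ClosePrice AS MarketValue
            FROM   dbo.BenchmarkConstituent c
            JOIN   dbo.MarketPrice p
                   ON p.SecurityId = c.SecurityId
                  AND p.PriceDate  = @AsOfDate
            WHERE  c.BenchmarkId = @BenchmarkId
        )
        UPDATE c
        SET    c.Weight = mv.MarketValue / t.TotalValue
        FROM   dbo.BenchmarkConstituent c
        JOIN   mv ON mv.SecurityId = c.SecurityId
        CROSS  JOIN (SELECT SUM(MarketValue) AS TotalValue FROM mv) t
        WHERE  c.BenchmarkId = @BenchmarkId;
    END;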
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

Client: Research Insight
Role: Senior Software Engineer
Description: Insight is a platform where analysts submit various data models on stocks assigned to them. The actual and estimated data is saved in the database, and various other measures are derived from the raw data via a calculation engine. The calculated financial measures, along with market data, help analysts publish research on stocks and assign ratings to them. Through various database procedures, this data is further used for reporting purposes.
Responsibilities:
● Wrote complex SQL queries against the SQL Server database.
● Responsible for writing DB stored procedures/functions.
● Worked with the Informatica PowerCenter tool to extract, transform, and load data.
● Involved in writing and delivering technical artifacts: requirement specifications, impact analyses, design specifications, test cases, and release & deployment documents.
● Worked with the Python programming language, developing loaders to load data into the data warehouse using libraries such as NumPy, pandas (DataFrames), and SQLAlchemy (for creating the engine and events).
● Performed data cleaning and preparation, and found missing values in a large data set.
● Used machine learning techniques and a scripting environment, IPython (Jupyter).
● Made predictions based on the data patterns and calculated ROC.
● Used the pandas framework to import and work on huge data sets irrespective of their format; it worked better than Excel for this project.
● Plotted graphs to visualize the data using Matplotlib packages.
● Used NumPy for mathematical calculations on the data and to find trends in it; also used linear regression on train and test data for predictive analysis.
● Good working knowledge of Hadoop and Hive architecture.
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

Client: Sparta
Role: Software Engineer
Description: Sparta is a set of financial data reporting templates that fetch data from the data mart.
Responsibilities:
● Interacted with BAs for requirements gathering.
● Involved in and responsible for writing and delivering technical artifacts: requirement specifications, impact analyses, design specifications, test cases, and release & deployment documents.
● Responsible for writing DB stored procedures/functions.
● Involved in supporting the system test environment to fix testing issues.
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

Client: MassMutual Data Management


Role: Software Engineer
Description: MassMutual is a financial services company providing life insurance, annuities, disability income, long-term care, retirement, trust services, and money management products and services. This is an insurance (Policy & Billing Center) domain project based on data warehousing and business intelligence technology, using OBIEE (BI) and SQL Server 2012 (DB).
Responsibilities:
● Thoroughly understood business practices and procedures to design, develop, and maintain automated workflow processes.
● Created/modified database objects such as tables, views, cursors, procedures, functions, triggers, etc.
● Wrote SQL Loader scripts to read data from external files for integration with the application.
Environment/Applications Used: Informatica PowerCenter, Microsoft SQL Server, BMC Control-M scheduler, Linux, and Jira.

EDUCATION
● B.Tech from ABES Engineering College

