
HARISH D

Mobile: +91-7780543585


E-Mail: harishd2356@gmail.com
___________________________________________________________________________

Professional Summary
• 4+ years of experience in Microsoft Azure cloud technologies.
• Experience migrating SQL databases to Azure Data Lake Store, Azure Data Lake Analytics, and Azure SQL Database; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
• Created resource groups, storage accounts, pipelines, datasets, and linked services, and invoked pipelines using PowerShell (a sketch of the equivalent Python SDK call follows this list).
• Hands-on experience with Azure analytics services: Azure Data Lake Store (ADLS) Gen1 and Gen2, Azure Data Factory (ADF), and (basic) Azure Databricks.
• Built data-integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, and Filter.
• Implemented a dynamic pipeline that extracts multiple files into multiple targets with a single pipeline.
• Automated execution of ADF pipelines using triggers and runbooks.
• Knowledge of basic ADF admin activities, such as granting access to ADLS using a service principal, installing integration runtimes, and creating services such as ADLS and Logic Apps.
• Worked extensively with data source types: SQL Server, Oracle, flat files, JSON, CSV, and Excel.
• Designed an Azure Logic App to send pipeline success/failure alert emails, file-unavailability notifications, etc.
• Strong documentation skills.
• Familiar with pipeline execution methods (Debug vs. Triggers).
• Monitored and managed Azure Data Factory.
• Involved in production support activities.
• Basic knowledge of Azure Databricks, including notebook creation.
• Created mount points to read data from ADLS Gen2 and ingest it into Azure SQL DB using Azure Databricks (a notebook sketch follows this list).
• Used ETL methodology extensively to support data extraction.
• Knowledge of Power BI.
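
Below is a minimal sketch of invoking an ADF pipeline programmatically, analogous to the PowerShell invocation mentioned above but written with the azure-mgmt-datafactory Python SDK; the resource names and the runtime parameter are hypothetical placeholders.

    # Sketch: trigger an ADF pipeline run from Python, analogous to
    # PowerShell's Invoke-AzDataFactoryV2Pipeline. All names are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Start the pipeline, passing a hypothetical runtime parameter.
    run = adf.pipelines.create_run(
        resource_group_name="<resource-group>",
        factory_name="<data-factory>",
        pipeline_name="<pipeline>",
        parameters={"fileName": "input.csv"},  # hypothetical parameter
    )

    # Poll the run status afterwards.
    status = adf.pipeline_runs.get("<resource-group>", "<data-factory>", run.run_id)
    print(status.status)  # e.g. "InProgress", "Succeeded", "Failed"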
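
And a sketch of the mount-and-ingest pattern from the last Databricks bullet, as it might look in a notebook; the storage account, secret scope, and table names are all hypothetical.

    # Sketch (Databricks notebook): mount ADLS Gen2 with a service principal,
    # read a CSV from the mount, and append it to Azure SQL DB over JDBC.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<app-id>",
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope="<scope>", key="<secret-name>"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    dbutils.fs.mount(
        source="abfss://<container>@<account>.dfs.core.windows.net/",
        mount_point="/mnt/raw",
        extra_configs=configs,
    )

    df = spark.read.csv("/mnt/raw/<file>.csv", header=True, inferSchema=True)

    # Credentials would normally also come from a secret scope.
    (df.write.format("jdbc")
        .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
        .option("dbtable", "dbo.<table>")
        .option("user", "<user>")
        .option("password", "<password>")
        .mode("append")
        .save())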

Education:
Completed B.Tech in Electronics and Communication Engineering from Jawaharlal Nehru Technological University.

Certifications:
Microsoft Certified: Azure Fundamentals
Credential ID: 992064599
Credential URL: https://www.credly.com/badges/bbc9c49a-6b86-4f90-9939-81e043a9f248

Professional Experience:
Currently working as a Software Engineer at Saxon.ai.

Technical Skills:
Cloud Platform: Microsoft Azure
Operating Systems: Windows, Linux
Languages: Basic Python, SQL
Databases: SQL Server, Oracle, Azure SQL Data Warehouse, Synapse
Tools: Azure Data Factory, Azure Data Lake Storage, Azure Databricks (basics)
Others: Jira, Confluence, Azure DevOps, Power BI, Tableau, SharePoint

Project 1:

Organization : Saxon.AI
Customer : Concert Group Inc
Period : November 2021 to February 2022
Description: Concert Group Inc is an insurance company, founded in Chicago, Illinois by successful industry entrepreneurs responding to significant fronting demand within their diverse client bases.

Roles & Responsibilities:

• Developed a database solution for the insurance dataset with master and reference schemas.
• Implemented efficient ADF pipelines to import data from the local system into Azure SQL.
• Imported and transformed data using Power Query.
• Created a variety of Power BI insurance reports using features such as bar charts, line charts, filters, custom visuals, and drill-down.
• Built forecasting using parameters, trend lines, and reference lines for figures that had been reported manually before this solution.
• Scheduled automatic refresh in the Power BI service.
• Added dynamic measures to automate reports based on client requirements.
• Implemented row-level security (RLS).
• Applied pivot/unpivot transformations (a pandas sketch of the unpivot step follows this list).
• Assisted the customer with analysis of claims incurred vs. premium earned for a year across their different lines of business (LOBs).
• Worked closely with the client to deliver efficient, high-quality results.
• Environment: Microsoft Azure, Azure Data Factory, Azure SQL DB, Databricks, Power BI, Azure Analysis Services, Azure DevOps, Data Lake Storage, SSMS.
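
As an illustration of the unpivot step above (performed in Power Query on the project, shown here in pandas with made-up columns):

    # Sketch: unpivot a wide table, like Power Query's "Unpivot Columns".
    import pandas as pd

    # Hypothetical wide table: one premium column per year.
    wide = pd.DataFrame({
        "policy_id": [101, 102],
        "2020": [1000.0, 1500.0],
        "2021": [1200.0, 1600.0],
    })

    # Year columns become rows of (policy_id, year, premium).
    long = wide.melt(id_vars="policy_id", var_name="year", value_name="premium")
    print(long)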

Project 2:

Organization : Saxon.AI
Customer : Insightbox (internal product)
Period : February 2021 to June 2022
Description: Saxon is a data and analytics company offering end-to-end data engineering and analytics services leveraging next-gen technologies from leading market vendors. Our core differentiators are robust frameworks, strong domain expertise, DataOps, and accelerators that provide quick time to value and maximize ROI.

Roles & Responsibilities:


• Responsible for building the data and reporting infrastructure from the ground up using Power BI and SQL, providing real-time insights into the product, marketing funnels, and business KPIs.
• Built data sources for various domains, such as manufacturing, retail, and healthcare, based on real-time data from online sources.
• Built organized ETL processes in Azure Data Factory and prepared data for further analytics work, including basic transformations such as eliminating redundancy and mapping columns before loading into the data warehouse (a PySpark sketch follows this list).
• Responsible for designing the data storage system, ensuring an efficient database design using forms such as the star schema.
• Mapped the fields of views from their sources according to business requirements.
• Created DAX queries to generate computed columns in Power BI.
• Generated computed tables in Power BI using DAX.
• Involved in creating new stored procedures and optimizing existing queries and stored procedures.
• Used Power BI and Power Pivot to develop data-analysis prototypes, and used Power View and Power Map to visualize reports.
• Published Power BI reports to the required workspaces and made Power BI dashboards available in web clients and mobile apps.
• Explored data in a variety of ways and across multiple visualizations using Power BI.
• Environment: Microsoft Azure, Azure Data Factory, Azure Synapse, Databricks, Power BI, Azure Analysis Services, Azure DevOps, Data Lake Storage.
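
A rough sketch of the pre-load transformation described above, dropping duplicates and mapping source column names to warehouse conventions; the paths and column names are illustrative only.

    # Sketch (PySpark): basic cleanup before loading into the warehouse.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    orders = (
        spark.read.parquet("/mnt/raw/retail/orders")  # hypothetical source path
        .dropDuplicates(["order_id"])                 # eliminate redundancy
        .withColumnRenamed("ord_dt", "order_date")    # map to warehouse naming
        .withColumnRenamed("cust_no", "customer_id")
    )

    orders.write.mode("overwrite").parquet("/mnt/curated/retail/orders")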

Project 3:

Organization : Jaguar Land Rover-P-IVI
Customer : Jaguar Land Rover
Period : January 2021 till date
Description: Jaguar Land Rover is a British multinational automotive company that produces luxury vehicles and sport utility vehicles. It is part of the Indian automotive company Tata Motors Limited. The principal activity of Jaguar Land Rover Limited is the design, development, manufacture, and sale of vehicles bearing the Jaguar and Land Rover marques.

Roles & Responsibilities:

• Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from different sources such as Azure SQL, Blob Storage, Azure SQL Data Warehouse, and a write-back tool.
• Implemented ETL activities using ADF pipelines to load data from on-premises source systems into ADLS Gen2.
• Created linked services for multiple source systems (SQL Server, ADLS Gen1 and Gen2, Blob and Table Storage).
• Created pipelines to extract data from on-premises source systems to Azure Data Lake Storage Gen2; worked extensively on copy activities and implemented copy behaviors such as flatten hierarchy and merge hierarchy; implemented error handling through the copy activity.
• Exposure to Azure Data Factory activities such as Lookup, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
• Configured Logic Apps to send email notifications to end users and key stakeholders via the web activity; created a dynamic pipeline to handle extraction from multiple sources to multiple targets.
• Configured and implemented Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts for notification of pipeline successes and failures.
• Worked on data frames to process structured CSV data and load it into Azure SQL DB.
• Worked extensively on SQL queries and stored procedures.
• Implemented delta-logic extractions for various sources with the help of a control table (a watermark sketch follows this list).
• Implemented data frameworks to handle recovery and to log pipeline data.
• Set up dependencies between multiple data factories.
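
A minimal sketch of the control-table (high-watermark) delta pattern mentioned above, using pyodbc; the control table and column names are hypothetical.

    # Sketch: delta extraction driven by a control table. Read the last
    # high-watermark, pull only newer rows, then advance the watermark.
    import pyodbc

    conn = pyodbc.connect("<odbc-connection-string>")
    cur = conn.cursor()

    # 1. Watermark recorded by the previous run (hypothetical control table).
    cur.execute("SELECT last_loaded_at FROM etl.control WHERE source_table = ?",
                "dbo.Policies")
    watermark = cur.fetchone()[0]

    # 2. Extract only rows changed since the watermark (the delta).
    cur.execute("SELECT * FROM dbo.Policies WHERE modified_at > ?", watermark)
    rows = cur.fetchall()  # ...load these into the target...

    # 3. Advance the watermark so the next run starts where this one ended.
    cur.execute("UPDATE etl.control SET last_loaded_at = SYSUTCDATETIME() "
                "WHERE source_table = ?", "dbo.Policies")
    conn.commit()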

Project 4:
Organization : MOLIPS
Customer : ONE (Shipping Logistics)
Period : July 2018 to December 2020
Description: MOL (Mitsui O.S.K. Lines) was launched in 1964. MOL is a Japanese transport company headquartered in Japan and one of the largest shipping companies in the world. The MOL fleet includes dry cargo ships (bulk carriers), liquefied natural gas carriers, Ro-Ro car carrier ships, oil tankers, container ships (among which MOL Triumph is the 4th-largest container ship in the world), and container terminals. Focus on container shipping has been reduced since April 2018.

Roles & Responsibilities:

• Sent all export documents, such as the shipping bill, survey report, and other documents, to agents for stuffing and loading cargo onto the vessel.
• Supervised cargo operations, updated principals about vessel performance, and finalized disbursement accounts.
• Liaised effectively with port, terminal, and statutory authorities for quick delivery of bulk consignments with no claims for damages or other causes.
• Monitored all pending clearances and delivery-order collections; sent timely reminders to customers to ensure that all sea bills, invoices, and documents were prepared accurately.
• Created pipelines to extract data from on-premises source systems to Azure Data Lake Storage; worked extensively on copy activities and implemented copy behaviors such as flatten hierarchy, preserve hierarchy, and merge hierarchy; implemented error handling through the copy activity.
• Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
• Configured Logic Apps to send email notifications to end users and key stakeholders via the web activity (a sketch follows this list); created dynamic pipelines to handle extraction from multiple sources to multiple targets; made extensive use of Azure Key Vault to configure connections in linked services.
• Configured and implemented Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts for notification of failed pipelines.
• Created Azure Stream Analytics jobs to replicate real-time data into Azure SQL Data Warehouse.
• Implemented delta-logic extractions for various sources with the help of a control table; implemented data frameworks to handle deadlocks, recovery, and logging of pipeline data.
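
A sketch of the notification handoff described above: the ADF web activity posts pipeline metadata to a Logic App HTTP trigger, which sends the email. Shown in Python for illustration; the URL and payload fields are hypothetical.

    # Sketch: post a pipeline-failure payload to a Logic App HTTP trigger,
    # which then emails end users and stakeholders. All values are placeholders.
    import requests

    LOGIC_APP_URL = "https://<region>.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke"

    payload = {
        "pipelineName": "pl_extract_bookings",   # hypothetical pipeline name
        "status": "Failed",
        "errorMessage": "Copy activity timed out",
        "runId": "<run-id>",
    }

    resp = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
    resp.raise_for_status()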

Project 5:

Organization : 24/7
Customer : DirecTV Now (streaming service)
Period : July 2017 to June 2018
Description: DirecTV Now is a streaming service; DirecTV Stream is the family of streaming multichannel television services offered in the United States by DirecTV.
Roles & Responsibilities:

• Solved customers' technical issues via chat.
• Handled escalations from the mailbox, a crucial activity requiring quick and accurate responses to supplier queries.
• Facilitated conference calls with counterparts for the smooth functioning of processes and provided updates in team huddles.
• Ensured that the quality standards set by clients were delivered.

(Dudam Harish)
