
NARSIMLU D
Phone: +91-9491562172
Email: narsimlu9491@gmail.com

SYNOPSIS:

 3.7 years of global IT experience in software development, integration, testing, and support,
handling global clients based in the USA, Canada, and the Netherlands.
 Hands-on experience in – Azure Data Lake Store (ADLS), Azure SQL, Azure Data Factory (ADF),
Azure Data Bricks (ADB), Azure DevOps, Azure Synapse Analytics, Azure BLOB storage, Azure
Logic App, Key Vault, SQL Server, PySpark, Spark SQL, etc.
 Excellent knowledge of ADF building components – Integration Runtime, Linked Services, Data
Sets, Pipelines, Activities.
 Designed and developed data ingestion pipelines from on-premise to different layers into the
ADLS using Azure Data Factory (ADF V2).
 Extensively worked on Azure Logic App integration tool.
 Configured environment details in Azure Key Vault.
 Experience in building Azure Data Bricks (ADB) Spark-Python Notebooks to perform data
transformations.
 Hands-on experience in Data Frames, and Spark SQL.
 Designed and developed an audit/error logging framework in ADF.
 Orchestrated data integration pipelines in ADF using various activities like Get Metadata,
Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, Until, etc.
 Implemented a dynamic pipeline to extract multiple files into multiple targets with the help of a
single pipeline.
 Automated execution of ADF pipelines using Triggers.
 Familiar with basic ADF admin activities such as granting access to ADLS using service
principals, installing the Integration Runtime (IR), and creating services like ADLS and Logic Apps.
 Extensively worked on data source types – SQL server, flat files, JSON, CSV, etc.
 Experience in SQL – joins, correlated queries, sub-queries, etc.
 Good knowledge of tables, stored procedures, views, etc.
 Good Knowledge of SQL performance tuning and Optimization.
 Designed Azure Logic Apps applications to send pipeline success/failure alert emails, file
unavailability notifications, etc.
 Exposure to multiple domains like Insurance, Retail, and Catalyst business.
 Connected with product owners via sync-up calls to gather business requirements.
 Having knowledge of Agile/Scrum methodology in Azure DevOps.
 Involved in Azure DevOps CI/CD implementation to deploy code to multiple
environments.
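
The audit/error-logging framework mentioned above is a common ADF pattern: every activity run writes a start/end/status row to an audit table (typically via a Stored Procedure activity). A minimal, hypothetical Python sketch of that pattern, with an in-memory list standing in for the audit table (all names are illustrative, not taken from the actual project):

```python
from datetime import datetime, timezone

# In-memory stand-in for the audit table an ADF pipeline would write to
# via a Stored Procedure activity (schema is illustrative).
audit_log = []

def run_with_audit(activity_name, fn, *args, **kwargs):
    """Run an 'activity' and record start/end time, status, and any error,
    mimicking an audit/error-logging framework."""
    entry = {"activity": activity_name,
             "start": datetime.now(timezone.utc),
             "end": None, "status": "Running", "error": None}
    audit_log.append(entry)
    try:
        result = fn(*args, **kwargs)
        entry["status"] = "Succeeded"
        return result
    except Exception as exc:
        entry["status"] = "Failed"
        entry["error"] = repr(exc)
        raise
    finally:
        entry["end"] = datetime.now(timezone.utc)

# One succeeding and one failing "activity":
run_with_audit("copy_customers", lambda: 42)
try:
    run_with_audit("copy_orders", lambda: 1 / 0)
except ZeroDivisionError:
    pass

statuses = [(e["activity"], e["status"]) for e in audit_log]
print(statuses)  # [('copy_customers', 'Succeeded'), ('copy_orders', 'Failed')]
```

In ADF itself, the equivalent is wiring the activity's success and failure paths to stored-procedure calls that insert the corresponding audit rows.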

TECHNICAL EXPERTISE:

Specialized Skill:
Azure Data Factory V2, Azure Databricks, ADLS, Azure Logic App,
Azure DevOps, Azure Synapse Analytics, Key Vault, MS SQL Server,
PySpark.

Project #3 : July 2022 to Aug 2023
Company name : Affluent Global Services Pvt Ltd
Project Name : Einstein Project
Client : Shell
Role : Data Engineer
Environment : Azure Data Factory V2, Azure Databricks, Azure DevOps, Key Vault,
Azure Synapse Analytics, ADLS, Logic App, MS SQL, PySpark.

Project Description:
Shell is a global group of energy and petrochemical companies that aims to
meet the world’s growing need for more and cleaner energy solutions in ways that are
economically, environmentally, and socially responsible. Shell is one of America's largest oil and
natural gas producers, natural gas marketers, and fuel marketers. Shell extracts eco-friendly
products from crude oil using catalysts.

Responsibilities:

 Designed and developed data ingestion pipelines from on-premises to different layers into the
ADLS using Azure Data Factory (ADF V2).
 Created Azure Databricks (ADB) Spark-Python notebooks to perform data transformations
per business needs.
 Extensively worked on Copy activities and implemented copy behaviours such as flatten
hierarchy, preserve hierarchy, and merge files.
 Extensively worked with mapping Dataflow in Azure Data Factory.
 Created Linked Services for multiple source systems (e.g., Azure SQL Server, ADLS, BLOB).
 Extensively used Azure Data Factory activities such as Lookup, Stored Procedure, If Condition,
ForEach, Set Variable, Append Variable, Get Metadata, Filter, Execute Pipeline, Wait, etc.
 Configured Logic Apps to send email notifications to end users and key stakeholders
with the help of the Web activity.
 Created a dynamic pipeline to handle extraction from multiple sources to multiple targets.
 Extensively used Azure Key Vault to configure the connections in linked services and in mount
points.
 Configured and implemented the Azure Data Factory Triggers and scheduled the Pipelines.
 Monitored the scheduled Azure Data Factory pipelines and configured alerts to get
notified of pipeline failures.
 Validated the data coming from the source servers against the Power BI dashboard.
 Extensively worked in SQL – joins, correlated queries, sub-queries, stored procedures, and views.
 Extensively used Azure DevOps for better product delivery and faster issue resolution by
creating Epics, PBIs, and sprints within the team.
 Deployed code to multiple environments through the CI/CD process, worked on code defects
during SIT and UAT testing, and provided support for data loads for testing.
 Developed Spark (Python) notebooks to transform and partition the data and organize files in
ADLS.
 Worked on Azure Databricks to run Spark-Python notebooks through ADF pipelines.
 Used Databricks utilities (widgets) to pass parameters at run time from ADF to Databricks.
 Reviewed individual work on ingesting data into Azure Data Lake and provided feedback based
on reference architecture, naming conventions, guidelines, and best practices.
 Involved in ADLS GEN1 to GEN2 Migration process.
 Created master documents for MVPs and created mapping sheets for the QA testing team.
 Created mount points and clusters in Azure Data Bricks.
 Good exposure to GitHub.

 Collaborated with product owners via sync-up calls to gather business requirements.

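
The widgets bullet above refers to the standard pattern where ADF's Databricks Notebook activity passes `baseParameters` that the notebook reads via `dbutils.widgets.get`. Since `dbutils` only exists inside a Databricks runtime, this hedged sketch uses a small stub to show the pattern outside Databricks (paths and parameter names are illustrative):

```python
# dbutils is only available inside a Databricks runtime; this stub mimics
# the widgets API so the pattern can be demonstrated locally.
class _Widgets:
    def __init__(self, params):
        self._params = params

    def get(self, name):
        # In Databricks, dbutils.widgets.get(name) returns the value the
        # ADF Notebook activity supplied in baseParameters.
        return self._params[name]

class _DbUtilsStub:
    def __init__(self, params):
        self.widgets = _Widgets(params)

# Simulate the baseParameters ADF would pass at run time:
dbutils = _DbUtilsStub({"source_path": "/mnt/raw/sales",
                        "load_date": "2023-08-01"})

source_path = dbutils.widgets.get("source_path")
load_date = dbutils.widgets.get("load_date")

# Derive a partitioned target path from the runtime parameters:
target_path = f"/mnt/curated/sales/load_date={load_date}"
print(source_path, "->", target_path)
# /mnt/raw/sales -> /mnt/curated/sales/load_date=2023-08-01
```

Inside a real notebook, the stub is simply dropped and the same `dbutils.widgets.get` calls work as-is.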
Project #2 : Jan 2022 to July 2022
Company name : Affluent Global Services Pvt Ltd
Project Name : Johnson Control Inc
Client : JCI
Role : Data Engineer
Environment : Azure Data Factory V2, Azure Storage, BLOB, Databricks, Logic App, Key
Vault, SQL Server, PySpark.

Project Description:
Johnson Controls is a global multi-industrial company with core businesses
in the automotive, building, and energy storage industries. Unity is designed to implement one
technology solution across their global operations to enable the integration of key business
functions, provide access to common data, and effectively deliver sustainable, competitive value to
their customers.

Responsibilities:

 Created Linked Services for multiple source systems (e.g., Azure SQL Server, ADLS, BLOB).
 Created pipelines to extract data from on-premises source systems to an Azure Storage Account.
 Extensively worked on Copy activities and implemented copy behaviours such as flatten
hierarchy, preserve hierarchy, and merge files.
 Implemented Error Handling concept through copy activity.
 Extensively worked with Azure Data Factory activities such as Lookup, Stored Procedure, If
Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
 Configured Logic Apps to send email notifications to end users and key stakeholders
with the help of the Web activity.
 Created a dynamic pipeline to handle extraction from multiple sources to multiple targets.
 Extensively used Azure Key Vault to configure the connections in linked services.
 Configured and implemented the Azure Data Factory Triggers and scheduled the Pipelines.
 Monitored the scheduled Azure Data Factory pipelines and configured alerts to get
notified of pipeline failures.
 Extensively worked in SQL – joins, correlated queries, sub-queries, stored procedures, and views.
 Good understanding of indexing, querying, and normalization.
 Extensively worked on Azure Databricks to implement SCD-1, SCD-2 approaches.
 Implemented delta logic extractions for various sources with the help of a control table.
 Deployed code to multiple environments through the CI/CD process, worked on code defects
during SIT and UAT testing, and provided support for data loads for testing.
 Developed Spark (Python) notebooks to transform and partition the data and organize files in
ADLS.
 Worked on Azure Databricks to run Spark-Python notebooks through ADF pipelines.
 Used Databricks utilities (widgets) to pass parameters at run time from ADF to Databricks.
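
The SCD-1/SCD-2 bullet above refers to standard slowly-changing-dimension handling; in the project this was implemented in Azure Databricks, but the core SCD Type-2 merge logic can be sketched with plain Python dictionaries (all table and column names are illustrative):

```python
def apply_scd2(dimension, incoming, business_key, tracked_cols, load_date):
    """SCD Type-2: expire the current version of a changed row and append a
    new current version; brand-new keys are inserted as current."""
    current = {row[business_key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[business_key])
        if old is None:
            # New business key: insert as the current version.
            dimension.append({**rec, "is_current": True,
                              "valid_from": load_date, "valid_to": None})
        elif any(old[c] != rec[c] for c in tracked_cols):
            # Tracked attribute changed: expire old version, append new one.
            old["is_current"] = False
            old["valid_to"] = load_date
            dimension.append({**rec, "is_current": True,
                              "valid_from": load_date, "valid_to": None})
        # Unchanged rows need no action.
    return dimension

dim = [{"cust_id": 1, "city": "Hyderabad", "is_current": True,
        "valid_from": "2022-01-01", "valid_to": None}]
incoming = [{"cust_id": 1, "city": "Chennai"},   # changed attribute
            {"cust_id": 2, "city": "Pune"}]      # new key
dim = apply_scd2(dim, incoming, "cust_id", ["city"], "2022-06-01")
print(len(dim))  # 3: expired v1 of cust 1, new v2 of cust 1, new cust 2
```

In Databricks the same logic is typically expressed as a Delta Lake `MERGE`, and the delta-extraction bullet pairs with it: a control table stores the last-loaded watermark so each run pulls only rows changed since the previous load.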

Project #1 : Jan 2020 to Jan 2022
Company name : Aerolite Industries Pvt Ltd
Project Name : Ashok Leyland Enterprise BI Implementation
Client : Ashok Leyland
Role : SQL Developer (Senior Electrical Engineer)
Environment : Azure Data Factory V1, Azure Storage Account, SQL Server.

Project Description:

Ashok Leyland Limited (AL), the flagship company of the Hinduja Group of
companies, is the second largest commercial vehicle manufacturer in India.
Ashok Leyland Limited has chosen HP as its IT partner to provide Infrastructure, System
Integration, and Management Services.

Responsibilities:

 Designed relational databases and wrote code that interacts with stored data to meet
business functional requirements.
 Organized data into tables, specifying data types, primary and foreign keys, and other
constraints.
 Created tables, stored procedures, and views as per business requirements.
 Wrote complex SQL queries – joins, correlated queries, and sub-queries – for applications and
business intelligence reporting.
 Good understanding of indexing, querying, and normalization.
 Analyzed queries, developed security protocols, and resolved problems.
 Worked with business analysts, database administrators, and other IT professionals to help
the company create and maintain databases to control and manipulate its data.
 Created Linked Services for multiple source systems (e.g., Azure SQL Server, ADLS).
 Created Pipelines to extract data from on-premises source systems to Azure Storage Account.
 Extensively worked on copy activities and implemented copy behavior such as flatten
hierarchy, preserving hierarchy, and Merge files.
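
The correlated-query work mentioned above is a typical BI reporting pattern: an inner query that references the outer query's row. A self-contained sqlite3 sketch (table, columns, and data are hypothetical, used only to illustrate the pattern):

```python
import sqlite3

# Hypothetical orders table; the goal is each customer's most recent order,
# found with a correlated subquery (the inner query references o.cust).
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER, cust TEXT, amount REAL, ordered_on TEXT)"
)
con.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, "AL-001", 500.0, "2021-01-10"),
    (2, "AL-001", 750.0, "2021-03-05"),
    (3, "AL-002", 300.0, "2021-02-20"),
])

rows = con.execute("""
    SELECT o.cust, o.amount, o.ordered_on
    FROM orders o
    WHERE o.ordered_on = (SELECT MAX(i.ordered_on)
                          FROM orders i
                          WHERE i.cust = o.cust)   -- correlated reference
    ORDER BY o.cust
""").fetchall()
print(rows)
# [('AL-001', 750.0, '2021-03-05'), ('AL-002', 300.0, '2021-02-20')]
```

The same shape (latest row per key) is often rewritten with window functions for performance, which is one of the tuning trade-offs the optimization bullets above allude to.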

PROFESSIONAL EXPERIENCE

Affluent Global Services Private Limited – Data Engineer – Jan 2022 to Aug 2023

Aerolite Industries Private Limited – Sr. Electrical Engineer – Jan 2020 to Jan 2022

ACADEMIC QUALIFICATION

 B.Tech from Jawaharlal Nehru Technological University (2012), Hyderabad.
