
Ravi

Mobile: +91 6304832789


Email: rpkumars1985@gmail.com

Experience Summary:

 7 years of strong experience in the IT industry with Microsoft technologies and Azure Data Factory, including design, development, and implementation of database systems using MS SQL Server 2014/2012 and Azure Databricks, for applications across business domains such as financial, pharmaceutical, and manufacturing.
 Good experience with Azure Data Factory, Analysis Services, Python, Scala, Azure Databricks, and PySpark.
 Expert in T-SQL development: writing complex queries that join multiple tables, and developing and maintaining stored procedures, triggers, and user-defined functions (see the sketch after this list).
 Created automated workflows with the help of triggers.
 Moved data between Excel and servers using Flows.
 Scheduled jobs in Flows and ADF pipelines.
 Experience with Active Directory.
 Knowledge of Power BI.
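
Illustrative T-SQL sketch (Python via pyodbc): the server, database, table, and procedure names below are placeholders, not taken from any engagement listed here; the snippet simply shows the kind of multi-table join and parameterized stored-procedure call described above.

import pyodbc

# Connection details are placeholders; substitute a real server, database, and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=SalesDB;"
    "UID=sqluser;PWD=<password>"
)
cursor = conn.cursor()

# Multi-table join of the kind described above (table and column names are illustrative).
cursor.execute("""
    SELECT c.CustomerName, o.OrderDate, SUM(oi.Quantity * oi.UnitPrice) AS OrderTotal
    FROM dbo.Customers c
    JOIN dbo.Orders o      ON o.CustomerId = c.CustomerId
    JOIN dbo.OrderItems oi ON oi.OrderId   = o.OrderId
    GROUP BY c.CustomerName, o.OrderDate
    ORDER BY OrderTotal DESC;
""")
for row in cursor.fetchall():
    print(row.CustomerName, row.OrderDate, row.OrderTotal)

# Parameterized call to a stored procedure (procedure name is illustrative).
cursor.execute("EXEC dbo.usp_GetOrdersByCustomer ?", 42)
print(cursor.fetchall())

conn.close()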

Employers:

 Worked as an Azure Data Engineer at Infosys from September 2021 to date.
 Worked as an Azure Data Engineer at Hexaware Technologies from August 2019 to September 2021.
 Worked as a .NET Developer at Capgemini from March 2018 to August 2019.
 Worked as a .NET Developer at Syntel from March 2016 to January 2018.

Technical Skills:

Azure ETL Tools: ADF, Azure Databricks, ADLS
Azure Storage: Blob, Azure SQL Server
Programming Languages: C#.NET, Python, PySpark
Databases: SQL Server
Services: WCF, Web API
Web Technologies: ASP.NET, MVC
Scripting Languages: JavaScript, jQuery, Angular 6
UI: HTML, CSS, Bootstrap, Angular Material
Editors: Visual Studio IDE, Visual Studio Code
Education:
• Completed Master of Computer Applications (MCA) from JNTU.

PROFESSIONAL EXPERIENCE

Key Projects

1 Universal Data Lake

Description: Involved in understanding business processes and coordinated effectively with the client on a weekly basis to gather specific user requirements.

Contribution:
 Worked on ETL (data extraction, transformation, and loading) in ADF.
 Created linked services for the source and destination data stores.
 Integrated data from an on-premises database to Azure SQL Server using Azure Data Factory.
 Retrieved data from REST APIs using ADF.
 Created pipelines using activity dependencies in ADF.
 Carried out tests for performance tuning and improving throughput in the Azure environment.
 Used ADF data flow components such as source, sort, filter, aggregate, window, rank, and target.
 Made extensive use of derived columns with multiple functions to manipulate data.
 Used the copy activity and a self-hosted integration runtime to migrate on-premises data to the Azure cloud.
 Used schedule and event-based triggers to schedule pipeline runs.
 Implemented SCD types and other complex scenarios using Data Factory pipelines.
 Used pre- and post-scripts to manipulate data before or after it flows through the ADF pipeline.
 Used Lookup, Until, and ForEach activities in ADF.
 Debugged pipelines using data flow debug sessions.
 Used the Web activity in ADF to call web endpoints.
 Wrote queries in Azure Databricks using PySpark (see the sketch after this list).
 Provided user access through Active Directory.
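
Minimal PySpark sketch of the kind of Databricks query used in this project; the storage paths, column names, and Delta output below are assumptions for illustration only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read raw data landed in ADLS by the ADF pipeline (path is a placeholder).
orders = spark.read.format("parquet").load("abfss://raw@datalake.dfs.core.windows.net/orders/")

# Simple transformation: filter, derive a date column, and aggregate per day.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Write the curated result back to the lake as Delta (format and path are assumptions).
daily_totals.write.format("delta").mode("overwrite").save(
    "abfss://curated@datalake.dfs.core.windows.net/daily_order_totals/"
)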
2 ARAMCO

Environment: ADF, Azure Databricks, ADLS

Contribution:
 Worked on ETL (data extraction, transformation, and loading) in ADF.
 Created linked services for the source and destination data stores.
 Integrated data from an on-premises database to Azure SQL Server using Azure Data Factory.
 Retrieved data from REST APIs using ADF.
 Created pipelines using activity dependencies in ADF.
 Carried out tests for performance tuning and improving throughput in the Azure environment.
 Handled multiple migration challenges, such as stored procedure calls within mappings and unconnected-lookup workarounds.
 Translated variables used in Informatica expressions into the equivalent Azure Data Factory logic using locals.
 Analyzed SQL scripts and designed solutions to implement them using PySpark (a sketch follows this list).
 Used Python in Databricks notebooks.
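
Hedged sketch of translating a SQL script into PySpark; the table and column names are invented for illustration and are not from the ARAMCO project.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Original SQL (illustrative):
#   SELECT customer_id, invoice_date, amount,
#          ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY invoice_date DESC) AS rn
#   FROM invoices
#   WHERE amount > 0
invoices = spark.table("invoices")  # assumes the table is registered in the metastore

w = Window.partitionBy("customer_id").orderBy(F.col("invoice_date").desc())

latest_invoices = (
    invoices
    .filter(F.col("amount") > 0)
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)  # keep only the latest invoice per customer
)

latest_invoices.show()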

3 Chassis Management

Description: This web application holds the following modules.

DVIR: This module receives and distributes any equipment inspection record that requires intermodal equipment to be held, repaired, and released.
M&R: This module is for chassis providers to manage vendor contracts and the associated estimate, approval, and invoice process related to M&R job orders.
Provider Declaration: This module is for chassis providers and shipping lines to maintain their declarations of the responsible billing provider for chassis.
Trucker Declaration: Truckers create and maintain declarations and rankings for their providers, which are used to determine the responsible party for chassis.
Fleet Registry: This module is for chassis providers; they can add, update, delete, and search for information about a particular chassis.

Contribution:
 Gathered client requirements and turned them into functional specifications.
 Designed and developed core logic for both the user and admin control panels.
 Used AJAX and jQuery UI for a rich user interface and asynchronous calls.
 Performed regular code reviews for code optimization and performance tuning.
 Generated stored procedures and views on a normalized database.
 Held weekly calls with the client regarding business requirements.

Environment: MS .NET 2010, C#.NET, SQL Server 2008 R2, JavaScript, jQuery, CSS.

4 Propsino Pearl

Description: An underwriting project in the insurance domain with four modules:

1. Search: This module facilitates fast and efficient full-text search in the database as well as in application folders and subfolders.
2. Underwriting: This menu holds the proposals and rating sheets; it is used to prepare medical and financial ratings and to suggest a decision on underwriting the proposal.
3. Administration: This menu is used to manage users, groups, and roles.
4. Rule Engine: This menu is used to define the rules for underwriting a policy.

Environment: MS .NET 2008, C#.NET, SQL Server 2008 R2, AJAX, JavaScript, jQuery, CSS.

Contribution:
 Gathered client requirements and turned them into functional specifications.
 Generated stored procedures and views on a normalized database.
 Designed and developed core logic for both the user and admin control panels.
 Used AJAX and jQuery UI for a rich user interface and asynchronous calls.
 Performed regular code reviews for code optimization and performance tuning.
