
PRAMOD KUMAR

Mobile No: +91-7204399589


Email: inbox2pramod@gmail.com

I have 9+ years of experience in Business Intelligence technology, interpreting and analyzing data to drive successful business solutions. Seeking a challenging role where I can apply my proven skills to deliver efficient, scalable solutions that meet business requirements and be an asset to the organization.
Work Experience Summary:
 My primary technical focus is ETL development for major data warehousing implementations using a wide range of tools and technologies, along with data quality, performance optimization, and post-production support.
 My functional expertise stems from working on the Business Mapping, Development, and Warranty phases of projects, mostly in the Banking, Petroleum & Gas, Healthcare, Sales, Insurance, and Telecom domains, covering business territories such as Europe, the USA, and South East Asia.
Professional Summary:
 Experience in the full life cycle of software project development, including the design and application development of Enterprise Data Warehouses on large-scale development efforts, leveraging industry standards using Talend and Informatica.
 Sound knowledge of Talend Studio and Informatica, having worked in the Banking, Petroleum, Sales, Healthcare, and Insurance business sectors.
 Working in Agile methodologies, attending scrum calls with users and the team.
 Strong in Client Interaction, Requirements gathering, Requirements Analysis, Design, Programming,
Testing, and Debugging.
 Hands-on experience with TAC (Talend Administration Centre) and TMC (Talend Management Console).
 Experience in data warehousing dimensional modelling concepts using star schema and snowflake models.
 Interacted with users to gather Business Requirements and produce design documentation.
 Provided Knowledge Transfer to the end-users and created extensive documentation on the design,
development, implementation, and process flow of the mappings.

 Participated in all phases of development life-cycle with extensive involvement in the definition and
design meetings, functional and technical walkthroughs.

 Significant experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources such as multiple relational databases, and with integrating data from flat files (CSV, text, XML, and JSON) into a common reporting and analytical data model (see the sketch after this summary).
 Created mappings for dimension loads and implemented them as per ETL specifications, including enhancements and changes to existing mappings and logic.
 Develop new modules and delegate tasks to team members as per project requirements.
 Involved in development, testing, implementation, and production support.
 Tuning mappings to obtain the maximum performance benefits.
 Establishing best-practice methodologies and documentation standards for ETL jobs.
 Ability to work independently or in a team environment and ability to perform complex
troubleshooting, root-cause analysis, and provide an efficient solution.
 Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities, flexible
in work schedules, and possess good communication skills.
 Result oriented, innovative in approach, and enjoy learning new methods & ideas.
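
As a rough illustration of the flat-file integration described in the summary above, the following Python sketch aligns CSV, JSON, and XML extracts to one common model and loads them into a staging table. The file names, columns, and the SQLite target are hypothetical placeholders, not an actual project warehouse:

```python
# Minimal sketch: consolidate CSV, JSON, and XML extracts into one staging table.
# File names, column names, and the SQLite target are hypothetical placeholders.
import sqlite3

import pandas as pd

sources = {
    "csv": pd.read_csv("customers.csv"),     # e.g. columns: id, name, country
    "json": pd.read_json("customers.json"),  # same logical columns
    "xml": pd.read_xml("customers.xml"),     # requires pandas >= 1.3 and lxml
}

# Align every extract to the common reporting model before loading.
common_columns = ["id", "name", "country"]
frames = []
for source_name, frame in sources.items():
    frame = frame[common_columns].copy()
    frame["source_system"] = source_name     # keep lineage for later data quality checks
    frames.append(frame)

staging = pd.concat(frames, ignore_index=True)

# Load into a common staging table that reporting models can build on.
with sqlite3.connect("staging.db") as conn:
    staging.to_sql("stg_customer", conn, if_exists="replace", index=False)
```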

Work Experience:
Organization Dates
Capgemini India Pvt. Ltd (Bengaluru) March 2013 to Present
Societe Generale Global Solution (Bengaluru) June 2012 to Feb 2013

Technical Skills

ETL Tools Talend Studio (6.x & 7.x), Informatica Power Center (8.x,9.x)
Big Data Platform Snowflake Data Warehouse, Big Data components for the Hortonworks
distribution, HDFS, Hive, Sqoop.
Database Oracle 10g/11g/12c, SQL Server, Azure Synapse (ADLS, Storage)
Reporting Tool Good knowledge of Dimensional Modeling and Tableau
Cloud Computing EC2, RDS, S3, Glacier, SQS, SNS, CloudFormation, EBS, VPC, IAM,
AWS Route53, Azure blob storage
Other Tools Toad, SQL Developer, WinSCP, Putty, Pyramid, Tidal, Autosys, Azure
DevOps Boards

Education:
 M.C.A. (Master of Computer Applications).

Projects:
Assignment-1
Period March 2020 – Till Date
Project Name Petronas – BI (EDH Scoreboard)
Project Type DEVELOPMENT
Environment Talend 7.2, Microsoft Azure Synapse Environment, Azure
blob storage, TAC
Role TALEND ETL LEAD

Brief description of the project:


The EDH Scoreboard solution provides an enterprise-wide data lake store that caters to the Enterprise
Digital program's data needs. The key objective is to bring data closer to the business and enable the discovery
of relevant data for analysis. Data is extracted from the source systems, loaded, and transformed into enterprise
data in the form of a logical data model. The solution also enables data quality reporting based on the profiling
of data attributes relevant to the EDH Scoreboard, providing key input into metrics and reports that measure
the quality of data against defined SLAs and data standards. The data is stored and transformed into a common
model to be accessed by the Digital program.
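
A minimal sketch of the kind of attribute-level profiling that could feed such data quality metrics is shown below; the column names, sample data, and the 95% completeness threshold are illustrative assumptions rather than the project's actual SLAs:

```python
# Minimal sketch: profile attribute completeness and flag columns that miss an
# assumed 95% completeness SLA. Columns, sample data, and threshold are illustrative.
import pandas as pd


def profile_completeness(frame: pd.DataFrame, threshold: float = 0.95) -> pd.DataFrame:
    """Return per-attribute completeness and a pass/fail flag against the SLA."""
    completeness = 1.0 - frame.isna().mean()  # fraction of non-null values per column
    report = completeness.rename("completeness").to_frame()
    report["meets_sla"] = report["completeness"] >= threshold
    return report.reset_index().rename(columns={"index": "attribute"})


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"well_id": ["W1", "W2", None, "W4"], "production_date": ["2020-01-01"] * 4}
    )
    print(profile_completeness(sample))
```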

Responsible for:

 Monitor business requirements, validate designs, schedule ETL processes, and prepare documentation for data flow diagrams.
 Absorb and adhere to architectural guidelines established by corporate policy.
 Work closely with Project Manager to develop and update the task plan for ETL work and to keep
the manager aware of any critical task issues and dependencies on other teams.
 Ensure the delivered ETL code runs and conforms to specifications and design guidelines.

 Review ETL design documents and work closely with the Business/Architect team.
 Understand the range of options and best practices for common ETL design techniques such as change data capture, key generation, and optimization (illustrated after this list).
 Work closely with the Talend Admin and support teams to set up the Test and Production environments and take care of Talend admin activities for the entire team.
 Create and maintain Talend jobs to transform and load data into DWH tables
 Help with the analysis of issues raised during QA/UAT/PROD phases and Data Reconciliation.
 Handle onsite communication and coordination with clients on business requirements.
 Perform root cause analysis on all processes, resolve production issues, validate data, perform routine tests on databases, and provide support for all ETL applications.
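
As referenced in the list above, the sketch below illustrates one common change data capture pattern: an incremental pull keyed on a last-updated watermark, followed by an upsert into the target. The table names, columns, and in-memory SQLite connection are hypothetical stand-ins, not the project's actual Talend implementation:

```python
# Minimal sketch of timestamp-based change data capture: pull only the rows
# updated since the last successful load, then upsert them into the target.
# Table and column names are illustrative; tgt_orders must have id as its primary key.
import sqlite3


def extract_changed_rows(conn: sqlite3.Connection, last_watermark: str):
    """Return source rows changed since the previous run's watermark."""
    cursor = conn.execute(
        "SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (last_watermark,),
    )
    return cursor.fetchall()


def load_incremental(conn: sqlite3.Connection, last_watermark: str) -> str:
    """Upsert changed rows into the target table and return the new watermark."""
    new_watermark = last_watermark
    for row_id, amount, updated_at in extract_changed_rows(conn, last_watermark):
        conn.execute(
            "INSERT INTO tgt_orders (id, amount, updated_at) VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount, "
            "updated_at = excluded.updated_at",
            (row_id, amount, updated_at),
        )
        new_watermark = max(new_watermark, updated_at)  # ISO timestamps compare lexically
    conn.commit()
    return new_watermark


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, updated_at TEXT)")
    conn.execute("CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
    conn.execute("INSERT INTO src_orders VALUES (1, 100.0, '2020-04-01T10:00:00')")
    print(load_incremental(conn, "2020-01-01T00:00:00"))
```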

Assignment-2
Period AUG 2018 – Feb 2020
Project Name BOJ Phase I and II, Pyramid Global Support
Project Type DEVELOPMENT & Support
Environment Talend, ORACLE 11G, Tableau, Linux Box system /Big Data
Hadoop/Hive
Role LEAD ETL DEVELOPER
Brief description of the project:
This project deals with extremely complex ETL. Earlier, the third-party legacy system ISTAR carried out all
the regulatory reporting for the Bank of Japan. As part of this project, the BOJ (Bank of Japan) regulatory
reports are generated for different Japanese regulators. Reports of different frequencies were built in CBR,
such as daily, weekly, monthly, quarterly, and annual reports. The project also involved the creation of a Data
Warehouse for the client, one of the leading financial companies in the country; to keep track of sales and
customer support, we developed a Data Warehouse system. This helps management take better, well-informed
decisions ahead of time and facilitates the framing of future company policies.
Responsible for:
 An active member of the requirement gathering and data modeling team.
 Work with the bank to gather requirements, identify pain areas, and suggest a prototype and data model.
 Assigning tasks to the team members and helping the team in understanding the requirements
technically.
 Unit test ETL code to ensure it can be delivered and run in the system testing environment.
 Design the end-to-end solution, from requirement gathering to deployment.
 Review ETL design documents and work closely with the Business/Architect team.
 Attend domain calls and update the domain manager on issues and concerns.
 Perform functional, logical, and integration testing for the mappings used.
 Configure the DDL to incorporate source systems as per the bank's requirements.
 Deploy jobs to the scheduler and define their dependencies.
 Have a clear understanding of the Pyramid SG global tool as an admin.
 Work as L2 and L3 support and help team members resolve issues.

Assignment-3
Period OCT 2017 – JUL 2018
Project Name CBR (Central Banking Repository)
Project Type Support and DEVELOPMENT

Environment INFORMATICA 9.5, ORACLE 11G, Tableau, Linux Box
system
Role LEAD ETL DEVELOPER
Brief description of the project:
CBR (Central Banking Repository) is a corporate and investment banking system where different modes of
trades take place. We must report to DMOs (Debt Management Offices) and central banks and generate
predefined, formulated reports for regulators. The system calculates Volcker metrics (Customer-Facing Trades
and Values, Turnover, and Aging) for each asset class at the portfolio, GOP, and TD level. We capture the trade
information and generate different types of metrics for reporting purposes.

Assignment-4
Period NOV 2016 – OCT 2017
Project Name SPRM (Siebel Partner Relationship Management)
Project Type DEVELOPMENT and MIGRATION
Environment Talend / Informatica 9.5, ORACLE 11G, AIX, Salesforce
Role SENIOR ETL DEVELOPER
Brief description of the project:
SPRM (Siebel Partner Relationship Management) collects transactional information about partner data from
various upstream systems, validates and processes it, and provides the quota-countable data to a downstream
system to calculate each partner's business with HPE. HPE provides sales compensation to its direct/indirect
partners and channel partners. The project spans three regions across the globe: AMS (the continental US),
APJ (the Asia Pacific region), and EMEA (Europe and Africa).

Assignment-5
Period OCT 2015 – OCT 2016
Project Name DATA QUALITY AND CONTROL
Project Type Senior ETL Developer
Environment Talend, ORACLE 11g, Business Object, Unix, ALM, Toad,
Tidal
Role SENIOR ETL DEVELOPER
Brief description of the project:
The Data Quality & Controls (DQC) project provides the ability to monitor Sales Comp data quality on an
ongoing basis: to proactively identify data quality issues, measure materiality, and resolve issues as quickly as
possible at an upstream level, which creates accountability for Sales Comp data quality at the source data
provider. The key benefits of this program are reduced manual claims, improved turnaround time, and
increased Sales Comp operations efficiency.
Assignment-6
Period MARCH 2014 – SEP 2015
Project Name SIQP
Project Type DEVELOPMENT & SUPPORT
Environment Informatica, Talend, Oracle 11g, Unix, ALM, Toad, Tidal
Role ETL DEVELOPER

Brief description of the project:
SIQP is an operational data store that collects transactional sales data from various upstream systems,
validates and processes it, and provides the quota-countable data to a downstream system to calculate the
compensation for sales reps. HP provides sales compensation to its direct/indirect sales representatives and
channel partners. The SIQP data mart collects transactional order and shipment data from around 100
upstream systems, processes it, and provides the quota-countable data to the Omega system to calculate the
compensation for sales reps.

Assignment-7
Period MAR 2013 – FEB 2014
Project Name SALES IDS DW
Project Type DEVELOPMENT & Migration
Environment Informatica 9.5.1, Salesforce UI/workbench, Oracle 11g, SQL
SERVER, Windows, Toad, Tidal
Role ETL Developer

Brief description of the project:


SALES IDS DW maintains all the IDS information related to Users, Accounts, Record Types, Groups,
World Regions, Organizations, Products, Contacts, Campaigns, and Opportunity-related data. We integrate the
data, loading it into an Oracle staging area and a SQL Server final database; to keep track of sales details and
customer support, we developed a Data Warehouse system. Data is published every two hours using the Tidal
scheduling tool. The BO team prepares business reports that provide insight into key data as per business
needs.

Assignment-8
Period JUNE 2012 – FEBRUARY 2013
Project Name FINACLE INDIA
Project Type DEVELOPMENT
Environment INFORMATICA 9.5.0, ORACLE 11G, AIX
Role ETL DEVELOPER

Brief description of the project:


This project delivers the EDW model, extracting data from Finacle and non-Finacle sources, and implements
the ADF (Automated Data Flow) product to submit the returns mandated by the RBI (Reserve Bank of India)
and to analyze data for the client's business requirements. The client has its own OLTP system, BANK
MASTER, and needed a Data Mart to maintain historical and current data in a central location for integration,
serving as a DSS for decision-makers.

(Pramod Kumar)
