
Curriculum vitae

Summary:

 Over 11 years of experience as an IT professional, specializing in Data Warehousing, with complete ownership of day-to-day deliverables during development.
 7 years of experience in ETL development using IBM WebSphere/InfoSphere DataStage 8.5/8.7/9.1/11.3, working extensively with DataStage Designer and Director, with full Software Development Life Cycle (SDLC) experience covering system design, development, implementation and testing.
 4 years of experience in Data Profiling and Data Quality using IBM Information Analyzer and QualityStage 11.7.
 1 year of experience with Trillium Control Center 14 and Informatica PowerCenter.
 Designed and developed parallel, server and sequence jobs using DataStage Designer. Experienced in using different types of stages such as Transformer, Aggregator, Merge, Join, Lookup, Sort, Copy, Remove Duplicates, Funnel, Filter, Pivot and shared containers for developing jobs.
 Extracted data from various sources such as Oracle, DB2, XML and flat files.
 Good knowledge of data warehousing principles such as data marts, OLTP, OLAP, dimensional modeling, fact and dimension tables, and star/snowflake schema modeling.
 Extensive experience in unit testing, functional testing, system testing and integration testing.
 Created local and shared containers to facilitate reuse of jobs.
 Good experience in the banking domain.
 Quick learner and adaptive to new and challenging technological environments.
 Good experience with the CI/CD deployment tools TeamCity and Octopus, and with GitHub.

Experience:

 Working as a Senior Consultant at Encora Innovative Labs Private Limited from January 2021 to date.
 Worked as an ETL Team Lead at Accenture Solutions Pvt. Ltd., Bangalore, from Oct 2016 to Oct 2020.
 Worked as an ETL Developer at IBM India Pvt. Ltd., Hyderabad, from Jun 2014 to Oct 2016.
 Worked as an ETL Developer at SARAL INFOTECH SYSTEM from Mar 2010 to Jun 2014.

Technical Skills:

ETL Tools : DataStage 8.2/8.7/9.1/11.3/11.7, Informatica PowerCenter 9.1
Data Quality Tools : QualityStage 8.2/8.7/9.1/11.3/11.7
Languages : SQL, PL/SQL, C, C++
Database/Applications : Oracle 8i/9i/10g, DB2
Operating Systems : MS DOS, Windows NT/2000, Sun Solaris 4.0
Packages : MS Suite
GUI Tools : Toad 7.6/9.5, Oracle SQL Developer

Profiling Tools : Trillium Control Centre, Information Analyzer
Scheduling Tools : Autosys
CI/CD Tools : TeamCity, Octopus, GitHub

Educational Qualification:

M.C.A. from JNTU, Hyderabad, in 2009.

Major Assignments:
Assignment # 1
Sales Lead Processing Tool
DHL
July 2021-Till Date
Role: Senior Consultant

SLPT (Sales Lead Processing Tool) is an ETL application that procures information about leads (potential DHL customers), classifies them as either Suspects (new leads) or Development Leads (leads linked to an existing DHL customer) and delivers them to COMET (DHL Express Sales CRM) for loading. SLPT runs every 15 minutes.
Current Scenario:
All Suspects are assigned to a default territory of the respective organization when no dedicated territory is provided by the source.
The COMET system then assigns the Suspects, using round-robin logic, to the LQs associated with the default territory (country-specific configuration).
If a territory is provided or generated based on country rules, it is delivered to COMET as-is.
Change Statement:
The UK has requested a change to the existing functionality so that SLPT can directly control the Suspect assignment in COMET based on LQ team member availability in conjunction with the last Suspect assignment.

Responsibilities:

 Interacted with business users and technical architects to analyze the data and gather requirements from various sources.
 Involved in daily meetings with the client on requirements and provided services to meet the required SLAs.
 Created SRS and design documents, and unit test cases with test results documents.
 Developed DataStage jobs for data cleansing, extraction and transformation.
 Created logical and physical process flows of the business requirements using Visio.
 Implemented performance-tuning techniques across various stages of the ETL process.
 Participated in the review of technical and business transformation requirements documents.
 Provided support for the built application.
 Involved in Business Requirement gathering sessions
 Involved in unit, integration, system and performance testing; performed performance tuning at the source, target, job and system levels; tested jobs against the unit test plan.
 Integrated data from the source system into the target database.

Environment: DataStage & QualityStage 11.7, Linux

Major Assignments:

Assignment # 2
Omni Channel Customer Identity
DHL
Jan 2021-June 2021
Role: Senior Consultant

The Omni Channel program aims to establish a person identity that can be used to uniquely identify individuals during interactions within the DHL Express framework. In Phase 1, for the pilot run, CSV will flow the email address to ANIDB; ANIDB flat files act as the source, and a QualityStage process performs the profiling and the de-duplication agreed with the business.
The scope for the ETL layer is to de-duplicate the data from the source applications (e.g. ANIDB) and then send the file to the consumer applications, i.e. Customer Service Management.

Responsibilities
 Responsible for understanding the business requirements and for designing and building applications accordingly.
 Worked with SMEs to gather all requirements and collect the information required for development.
 Used the Investigate stage to identify potential anomalies in the source system and the Standardize stage to standardize name and address data.
 Worked with the Match Designer tool to create match passes, which are required by the Match Frequency stage and the One-source and Two-source Match stages.
 Used the One-source Match stage to identify duplicate records in the source.
 Involved in the preparation of the System Requirements Specification document.
 Provided technical assistance to the team whenever support was needed.

Environment: DataStage & QualityStage 11.7, Information Analyzer 11.7, Unix

Major Assignments:

Assignment # 3
Data Load Match Application (DLMA)
Commonwealth Bank of Australia
Jun 2019-Sept 2020
Role: Team lead

The Data Load and Matching Application applies standardization, validation and parsing rules and reports validation/exception errors for the product systems. It provides the ability to identify and deliver updates to SAP for new customers, changes to existing customers, address details, new accounts, changes to existing accounts, parties linked to new accounts, and related changes.

Responsibilities:
 Responsible for understanding the business requirements and for designing and building applications accordingly.
 Worked with SMEs to gather all requirements and collect the information required for development.
 Designed, developed and unit tested complex ETL jobs; performed performance tuning of ETL jobs.
 Involved in analyzing the quality of the jobs developed by team members, suggesting performance improvements and carrying out performance tuning.
 Used DataStage sequencer jobs extensively to handle interdependencies and to run DataStage server/parallel jobs in order.
 Worked with peers to follow the business software process.
 Provided technical assistance to the team and reviewed all code.
 Coordinated with team members and administered all onsite and offshore work packages.
 Designed job batches and job sequences for scheduling parallel jobs using UNIX scripts, Autosys job JILs and DataStage Director.
 Wrote extensive UNIX scripts for running the DataStage jobs (a representative sketch follows this list).
 Developed DataStage job sequences using the User Variables Activity, Job Activity, Wait for File, Execute Command, Loop Activity and Terminator Activity stages.
 Developed Error Logging and Auditing strategies for the ETL jobs.
 Used DataStage Parallel Extender stages, namely Data Set, Sort, Lookup, Peek, Standardize, Row Generator, Remove Duplicates, Filter, External Filter, Aggregator, Funnel, Modify and Column Export, in the ETL coding.
 Wrote release notes and deployment documents, and scheduled the jobs via Autosys.
 Involved in the code deployment process from DEV to ST and E2E environments using the CI/CD tools TeamCity, Octopus and GitHub.
 Involved in preparing the TDD and Autosys batch flow documents.
 Supported the production support team during code deployment.
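A minimal sketch of the kind of UNIX wrapper used to run the DataStage jobs is shown below. It is illustrative only: the project name, job name, parameter and install path are hypothetical placeholders, and the exit-code checks reflect typical dsjob -jobstatus behaviour rather than any project-specific convention.

    #!/bin/sh
    # run_ds_job.sh -- illustrative wrapper for running a DataStage job via dsjob
    # (hypothetical project/job/parameter names; install path varies per environment)
    PROJECT="DLMA_PROJ"              # hypothetical DataStage project
    JOB="Seq_Load_Customer"          # hypothetical sequence job
    BUS_DATE="$1"                    # business date passed in by the scheduler (e.g. Autosys)

    DSENGINE=/opt/IBM/InformationServer/Server/DSEngine   # assumed default install path
    . $DSENGINE/dsenv                                      # source the DataStage engine environment
    DSJOB=$DSENGINE/bin/dsjob

    # Run the job, wait for completion and return the job status as the exit code
    $DSJOB -run -mode NORMAL -param pBusinessDate="$BUS_DATE" -jobstatus "$PROJECT" "$JOB"
    rc=$?

    # dsjob -jobstatus typically returns 1 = finished OK, 2 = finished with warnings
    if [ $rc -eq 1 ] || [ $rc -eq 2 ]; then
        echo "$JOB completed with status $rc"
        exit 0
    fi

    echo "$JOB failed with status $rc; check the Director log" >&2
    $DSJOB -logsum "$PROJECT" "$JOB" | tail -20    # last log entries for quick triage
    exit 1

Scripts along these lines are what an Autosys JIL command job would typically invoke.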

Environment: DataStage & QualityStage 9.1, Information Analyzer 9.1, Autosys, SQL, Unix, Oracle 10g, SAP UI

Major Assignments:

Assignment # 4
Nordea Collateral Solution
Nordea
Oct 2016 - May 2019
Role: Team Lead

Data is extracted from the current legacy systems in the four countries (Norway, Finland, Denmark, Sweden) and then, via an Extract, Transform and Load setup, consolidated into a shared COLLATE database, where it is separated into the different entities needed per the functional requirements; the separation is expected to be on legal entities to match the COLLATE structure.
The migration architecture is designed to ensure integrity and validity when producing the dataset for the target system load, where the target is the migration tables of the COLLATE DB model. Validation is done within the cleansing phase, in the migration phase and finally in the cut-over phases, to ensure that the data meets the requirements and expectations of the given phase.

Responsibilities:
 Involved in understanding the business process and coordinated with business analysts to obtain specific user requirements.
 Extracted data from sources like Oracle and flat files.
 Developed end-to-end jobs from legacy to staging, staging to work staging, work staging to work target, work target to target, and target to MIG.
 Developed DataStage parallel jobs based on the mapping document.
 Developed UNIX shell scripts (the DCM MIG utility script and the Live utility script) that load the data from the source through to MIG (a simplified sketch follows this list).
 Performed unit testing of the developed jobs to ensure they meet the requirements.

 Designed and developed parallel jobs using DataStage Designer.
 Used different types of stages such as Transformer, Aggregator, Merge, Join, Lookup, Sort, Remove Duplicates, Funnel, Filter, Pivot and shared containers for developing jobs.
 Involved in performing unit testing
 Used diverse partitioning methods such as Auto, Hash, Same and Entire.
 Involved in performing data profiling using IBM InfoSphere Information Analyzer.
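The following is a simplified sketch of the layered-load idea behind the DCM MIG utility script described above, not the actual script: the project name, sequence names and country parameter are hypothetical, and only the run-each-layer-in-order, stop-on-failure logic is shown.

    #!/bin/sh
    # mig_load.sh -- illustrative sketch of a layered load driver (hypothetical names)
    # Runs the DataStage sequence for each layer in order and stops on the first failure.
    PROJECT="NCS_MIG"                 # hypothetical project
    COUNTRY="$1"                      # e.g. NO, FI, DK or SE
    DSJOB=/opt/IBM/InformationServer/Server/DSEngine/bin/dsjob   # assumed install path

    # Layer order: legacy -> staging -> work staging -> work target -> target -> MIG
    for SEQ in Seq_Legacy_To_Stg Seq_Stg_To_WrkStg Seq_WrkStg_To_WrkTgt Seq_WrkTgt_To_Tgt Seq_Tgt_To_MIG
    do
        echo "Running $SEQ for $COUNTRY"
        $DSJOB -run -param pCountry="$COUNTRY" -jobstatus "$PROJECT" "$SEQ"
        rc=$?
        # Anything other than 1 (finished) or 2 (finished with warnings) is treated as a failure
        if [ $rc -ne 1 ] && [ $rc -ne 2 ]; then
            echo "$SEQ failed with status $rc; aborting the load" >&2
            exit 1
        fi
    done
    echo "All layers loaded through to MIG for $COUNTRY"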

Environment: DataStage & QualityStage 11.3, Information Analyzer 11.3, SQL Developer, UNIX, Oracle 11i

Assignment # 5
BPaid Migration
Barclaycard, UK.
Feb 2014 - Oct 2016
Role: ETL developer
The Global Payment Acceptance (GPA) business unit within Barclaycard is responsible for card acquiring services offered to its merchant customers across the UK and continental Europe, covering about 38% of the market share in the UK. Barclaycard's current portfolio of acquired merchants is principally hosted across two processing systems, namely Darwin and CAMS II. There are approximately 300K merchants being managed on these systems, with a degree of interdependence between the two. They are to be replaced by a newly established installation of Bank WORKS, which is to be hosted and managed internally by Barclaycard. The key drivers for data migration are a combination of business and operational needs, compliance needs and the data needs of the target system.

Data is migrated from these incumbent systems to the new Bank WORKS installation, which will become the core processing engine for all merchant acquiring business, and to TCV MDM, which handles the master data for customer information, allowing the decommissioning of Darwin and CAMS II.

Responsibilities:

 Responsible for managing the scope, planning, tracking and change-control aspects of the project.
 Translated customer requirements into formal requirements and design documents, established specific solutions, and led the programming and testing efforts that culminated in client acceptance of the results.
 Responsible for business requirements gathering, discussions and analysis, while adhering to customer policies and standards.
 Developed mapping specifications, job designs, SQL queries and UNIX scripts.
 Involved in conducting technical discussions for onboarding suitable ETL resources into the team.
 Involved in unit/regression/integration testing, issue analysis and preparation of test case documents.
 Developed queries that build the merchant hierarchy (an illustrative sketch follows this list).
 Coordinated with the team and supported them in business understanding and implementation.
 Supported different teams in the account with ETL issues, solution implementation designs and job development.
 Responsible for knowledge transfer to team members on functional knowledge of the BOC and Bank WORKS modules.
 Profiled source files using Information Analyzer.
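As a rough illustration of the merchant-hierarchy queries mentioned above, the sketch below walks a parent/child merchant relationship with an Oracle CONNECT BY query driven from a UNIX wrapper. The table name, columns and connection string are hypothetical placeholders and do not reflect the actual Barclaycard schema.

    #!/bin/sh
    # merchant_hierarchy.sh -- illustrative sketch only (hypothetical table/columns/connection)
    ORA_CONN="rpt_user/rpt_pwd@BPAIDDB"     # hypothetical Oracle connection
    OUT=/tmp/merchant_hierarchy.csv

    sqlplus -s "$ORA_CONN" <<'EOF' > "$OUT"
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
    -- Walk from top-level merchants down to their children, indenting each level
    SELECT LPAD(' ', 2 * (LEVEL - 1)) || merchant_id || ',' || merchant_name || ',' || LEVEL
    FROM   merchant
    START WITH parent_merchant_id IS NULL
    CONNECT BY PRIOR merchant_id = parent_merchant_id
    ORDER SIBLINGS BY merchant_name;
    EOF

    echo "Hierarchy written to $OUT"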
Environment: Oracle 11g, Informatica 9.1.1, UNIX, Trillium Control Centre 14, HP Quality Center 10

Assignment # 6
Blue Harmony
IBM
Mar 2010 - Dec 2013
Role: Data profiling analyst
International Business Machines (IBM) is implementing SAP R/3 ERP Core Component (ECC)
6.0 for Finance (FI), Controlling (CO), Materials Management (MM), Project System (PS), Work
Flow (WF) and Sales and Distribution (SD). The Blue Harmony Global SAP Implementation
Project is an IBM initiative that will replace legacy ERP applications across the entire company
with one common SAP ERP platform. The key objective of the Blue Harmony SAP Global
Implementation Project is that the single SAP platform will utilize common global business
processes and best practices to allow IBM to achieve business benefits throughout the globe.
The approach that IBM will follow to achieve this objective is to conduct the Blue Harmony SAP
Global Implementation Project in two Waves that take advantage of core competencies and focus
on businesses and business applications that can provide significant return on IBM’s investment. 
The Data Integration team is one of the sub-teams in Blue Harmony; this team develops objects using the DataStage tool with the SAP Pack. It receives data from legacy systems and loads the transformed data into the SAP application in the form of IDocs, BAPIs, ABAP etc.
Responsibilities:
 Understanding the business specification documents and customer requirements.
 Involved in extracting, transforming and loading data from sources such as flat files and relational databases, and placing it into the respective targets.
 Used most of the stages, such as Sequential File, Aggregator, Sort, Join, Lookup, Funnel, Copy, Filter, Transformer and DB2 Connector.
 Involved in the creation of source-to-staging, staging-to-alignment and alignment-to-preload jobs.
 Analyzed the customer source data and designed QualityStage jobs for standardization, matching and survivorship to achieve a single view of the customer based on the business rules.
 Effectively used the Standardize stage to standardize the source data using the existing rule sets such as Name and Address.
 Effectively used the Data Rules stage, implementing business rules to validate source data.
 Involved in creating DataStage job sequences for running IA data rules through DataStage automation jobs.
 Involved in developing an approach to standardize other countries' names and addresses, customized to US standards. The new standardization process was developed because there were many anomalies in names and addresses between countries such as the US, UK, Canada and China.
 Used the AVI stage to validate the address information coming from the source and to generate valid and invalid address reports.
 Performed key analysis in Information Analyzer (on either a single key column or multiple columns / natural keys).
 Used IBM Information Analyzer for Column Analysis, Primary Key Analysis and Foreign
Key Analysis to develop a detailed Data Profiling report.
 Built several sequencer jobs using stages such as Job Activity, Notification Activity, Routine Activity, Execute Command, User Variables Activity, Sequencer, Wait for File Activity, Terminator Activity, Start Loop Activity and End Loop Activity.
Environment: IBM DataStage 8.7 with SAP Packs, Information Analyzer 8.7, QualityStage 8.7; Database: DB2; OS: AIX; ClearQuest
