SKILLS SUMMARY:
Extensively used ETL methodology to support data extraction, transformation, and loading in a
corporate-wide ETL solution using Informatica PowerCenter, PowerExchange, and Data Quality.
Knowledge of Power BI with DAX.
Knowledge of Teradata.
Hands-on experience with Oracle 8i, SQL, and PL/SQL.
Hands-on experience with flat files and extracting data from flat files and DB2.
Data Quality checking using Informatica Data Quality.
Design of Informatica mappings based on the business requirements.
Building new mappings and enhancements to existing mappings and plans using
PowerCenter and Data Quality.
Integrating Informatica DQ Plans with Informatica Power Center.
Creating reports using Informatica DQ and PowerCenter Data Analyzer.
Analyzing test results and test cases of Informatica mappings.
Involved in self review and peer review of deliverables.
Checking the database using tools/queries to verify successful transactions.
Cleansing & standardizing data with Informatica DQ.
Involved in Preparation of ETL Reference Documentation.
Detailed Profile:
Period: August 2015 to August 2018
Client: MUFG Securities, New York
Project: Mitsubishi ETL Project-02
Role: Delivery Senior Software Engineer (Team Member)
Technology: Oracle 9i, Informatica 9.x, MS SSIS
Description:
MUFG Securities provides a range of securities services to government agencies, corporations, and
institutional investors. It specializes in global sales and trading in fixed income and equities, debt and
equities financing, and investment research. The company assesses and manages risk, trades securities
and structures, and executes transactions; underwrites and trades bonds from a range of issuers,
including private and public sector corporations, banks, and government institutions; and offers credit
risk hedges and transaction services to help improve the balance sheets of corporate and financial
institutions. It also provides specialized trading services for U.S. and European government bonds,
foreign currency denominated government-guaranteed bonds, mortgage-backed securities, emerging
bonds, swaps and options, and listed derivatives. It also offers in-depth analysis of companies and
industries, markets, and economies to investors; and equity research and fixed income research services.
As part of the Enhanced Prudential Standards (EPS) reporting and governance requirements of the
Federal Reserve, the Mitsubishi group needs to report the Comprehensive Capital Analysis and Review
(CCAR), a consolidated set of financial reports for all its entities in the US. In view of that, MUFG has
created a holding company called MUAH, which consists of MUB, MUS, and seven other smaller entities
based in the US. MUS needs to send all the data in a prescribed format to MUB so that the same can be
reported to the Fed as MUAH. The following required reports must be delivered by the end of the
project: 14A, 9YC, and 2052a.
Responsibilities:
Extracted data from various sources such as Oracle, flat files, and SQL Server.
Loaded and transformed large sets of structured, semi-structured, and unstructured data.
Managed the data flowing from different source systems into the staging environment and
validated the transformation logic by writing SQL queries to validate the data in the source
systems.
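Source-to-staging validation of this kind can be sketched as a row-count comparison. The table names and the in-memory SQLite database below are hypothetical stand-ins for the Oracle/SQL Server sources used on the project:

```python
import sqlite3

def validate_counts(conn, source_table, staging_table):
    """Compare row counts between a source table and its staging copy."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    stg = cur.execute(f"SELECT COUNT(*) FROM {staging_table}").fetchone()[0]
    return src == stg, src, stg

# Demo with an in-memory database standing in for the real sources.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (id INTEGER, amount REAL);
    CREATE TABLE stg_trades (id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5);
    INSERT INTO stg_trades VALUES (1, 100.0), (2, 250.5);
""")
ok, src, stg = validate_counts(conn, "src_trades", "stg_trades")
print(ok, src, stg)  # True 2 2
```

In practice such checks extend to column-level checksums and not just counts, but the shape of the comparison is the same.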
Extensively involved in data extraction, transformation, and loading (the ETL process) from source
to target systems using MS SSIS packages.
Participated in the analysis, design, and development phases of report development,
performance tuning, and production rollout for every report of the Information Technology
Department.
Documented all data mapping and transformation processes in the functional design documents
based on the business requirements.
Involved in predefining data mappings from ETL and BI to the relational source, and running
ad-hoc queries to provide optimized reports.
Worked with Agile methodology.
Involved in meetings scheduled to discuss the documented defects logged by the testing team.
Worked on ETL to understand the mappings for dimensions and facts.
Worked on validating the RWA (Risk-Weighted Assets) calculations for all counterparties in the
Credit Risk and Market Risk applications.
Worked on validating the RWAs for Mortgage-Backed Securities (MBS) to calculate risk, and
validated the Fed 2052a report against the securities buy/sell transactions.
Performed data validations for each netting set associated with Repo, Reverse Repo, Securities
Lending, and Securities Borrowing transactions.
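At its simplest, the RWA figure being validated is the exposure amount multiplied by the counterparty's regulatory risk weight. A minimal sketch of that check, with hypothetical counterparties, exposure amounts, and weights:

```python
def rwa(exposure, risk_weight):
    """Risk-Weighted Assets: exposure amount times the regulatory risk weight."""
    return exposure * risk_weight

# Illustrative counterparty exposures (hypothetical figures and weights).
exposures = [
    {"counterparty": "A", "ead": 1_000_000.0, "risk_weight": 0.20},
    {"counterparty": "B", "ead": 500_000.0, "risk_weight": 1.00},
]
total_rwa = sum(rwa(e["ead"], e["risk_weight"]) for e in exposures)
print(total_rwa)  # 700000.0
```

The validation work then amounts to recomputing this figure independently per counterparty and reconciling it against the application's reported value.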
Worked on a daily basis with lead data warehouse developers to evaluate the impact on the
current implementation and the redesign of all ETL logic.
Owned the assigned reports, worked on them and updated the Report Development Scheduler
for status on each report.
Queried the databases, wrote test validation scripts and performed the System testing.
Worked with the users to do the User Acceptance Testing (UAT).
Period: August 2014 to July 2015
Client: Symantec
Project: Information Management Business Intelligence-Sustain
Role: Delivery Software Engineer (Team Member)
Technology: Oracle 9i, Informatica 9.x, Data Quality, Teradata
Description:
Symantec implemented the PLAN-BUILD-RUN model for planning, development, deployment and
maintenance of application systems to ensure business continuity and scalability. Information
Management Business Intelligence-Sustain is a project that supports 90+ applications running in
the production environment and handles the ad-hoc requests that come from business users.
The Operations group is in-charge of day-to-day maintenance and resolution of all production issues with
the EBI applications. The EBI infrastructure consists of data extraction and loading workflows built in
Informatica and reporting services built in Business Objects. Various data extraction workflows are
built for loading data into data marts, loading sales data from vendors and operational data from
application service providers, and extracting customer data for external vendors/service providers.
Roles and Responsibilities:
Involved in requirements gathering from the business users and developed mappings, sessions,
and workflows using Informatica PowerCenter.
Developed data standardization rules using IDQ; the same IDQ plans are exported as mapplets to
PowerCenter.
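A standardization rule of the kind built in IDQ can be sketched in Python for illustration. The specific cleansing steps below (whitespace collapsing, a US phone format) are hypothetical examples, not the actual project rules:

```python
import re

def standardize_name(raw):
    """Trim, collapse internal whitespace, and title-case a customer name."""
    return re.sub(r"\s+", " ", raw.strip()).title()

def standardize_phone(raw):
    """Strip non-digits and format a 10-digit US number (hypothetical rule)."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 10:
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
    return raw  # leave non-conforming values untouched for manual review

print(standardize_name("  john   SMITH "))  # John Smith
print(standardize_phone("212.555.0143"))    # (212) 555-0143
```

In IDQ these steps would be individual transformations in a plan; exporting the plan as a mapplet lets PowerCenter mappings reuse the same logic.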
Mainly involved in loading all data into the data marts and giving status reports to the
business users.
Monitored the support mailbox for different user requests and tracked the job failure
notifications sent when any job in the Informatica Monitor is in an error or failed state.
Communicated with different teams about downtime for the production servers.
Interacted with developers, DBAs, and the Application Support Team to resolve issues.
Responsible for all production monitoring activities for top-priority cases, tracking the activities
by logging tickets in BMC Remedy.
Updating the tickets periodically with the progress on the issue and closing the tickets once the
issue is resolved.
Monitored the support mailbox for different user requests and tracked the job failure
notifications sent when any Appworx job is in an error state.
Responsible for all job failures and for resolving the issues with prompt action.
Understanding the business requirements.
Developed the mappings for Source to Staging as per the ETL specification.
Developed tasks such as command, session, and workflow using the Workflow Manager.
Involved in self review and peer review of deliverables.
Involved in Preparation of ETL Specifications and Unit Test Case Document.
Involved in Preparation of ETL Reference Documentation.
Used the Workflow Manager for creating, validating, testing, and running the sequential and
concurrent batches and sessions, and scheduling them to run at specified times with the required
frequency.
Description:
The client has legacy systems such as CMA (Customer Master and Alignment), which contains all the
necessary information regarding the client's customers, client staff, the sales force, and the areas of
deployment of the sales force, which is used for business purposes.
CMA contains information that must be made available to various other child systems such as
RDS, CCS, MPS, and ORIEN, which are used for various business purposes. During the flow of
information from CMA to the other systems, called the sync-up process, the parent system has to
remain offline, meaning no changes can be made to its data until the information has flowed to the
child systems. Once the business day ends, we start the flow of data from CMA to the ODS (a
repository of the necessary data, which flows from CMA and then flows on to the other subsystems),
as well as the enhancement of True Comp and the external vendors IMS US, IMS Canada, and Wyeth.
Used ETL to extract and load data from Oracle, flat files, and Excel sheets into Oracle and flat files.
Building of new mappings/enhancements in the mappings and Plans using Power Center and
Data Quality.
Integrating Informatica DQ Plans with Informatica Power Center.
Extensively used the Expression, Aggregator, Router, Sequence Generator, Lookup,
Update Strategy, Filter, and Stored Procedure transformations.
Creating crontab UNIX job schedules as per the requirements.
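A crontab schedule of that kind might look as follows; the timing, script path, and log path are hypothetical:

```
# Run the nightly CMA-to-ODS load at 11:30 PM, Monday through Friday
30 23 * * 1-5 /app/etl/scripts/run_cma_ods_load.sh >> /app/etl/logs/cma_ods.log 2>&1
```

The five leading fields are minute, hour, day of month, month, and day of week; redirecting both stdout and stderr to a log file keeps a record of each run for support.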
The GCE Data Migration work stream makes use of a combination of technologies around the
extraction, transformation, and loading of source system data into the operational data store and the
data warehouse. The business needs to reflect the changes occurring on the legacy system (CCMS)
on the GCE target system. The functionality is to identify any record-level changes occurring at the
legacy system end and reflect those changes at the GCE target end in real time; in other words, it is a
Change Data Capture system. A record-level change in the legacy system could be a position change,
which could have been newly inserted, updated, or deleted. The system is therefore able to identify
these changes and reflect them on the GCE target model in real time.
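The record-level change detection described above can be sketched as a comparison of keyed snapshots. This is a simplification of a real-time CDC mechanism, with hypothetical position records:

```python
def detect_changes(previous, current):
    """Record-level change data capture by comparing keyed snapshots.

    `previous` and `current` map a record key to its attribute dict;
    returns the inserts, updates, and deletes needed at the target.
    """
    inserts = {k: v for k, v in current.items() if k not in previous}
    deletes = {k: v for k, v in previous.items() if k not in current}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    return inserts, updates, deletes

# Hypothetical position records keyed by position id.
before = {"P1": {"qty": 100}, "P2": {"qty": 50}}
after = {"P1": {"qty": 120}, "P3": {"qty": 75}}
ins, upd, dele = detect_changes(before, after)
print(sorted(ins), sorted(upd), sorted(dele))  # ['P3'] ['P1'] ['P2']
```

A production CDC tool reads the legacy system's change log rather than diffing full snapshots, but the classification into inserts, updates, and deletes applied at the target is the same.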
Description:
The client, Blue Shield of California, an independent member of the Blue Shield Association, is a
not-for-profit health plan (health insurance) dedicated to providing Californians with access to
high-quality health care at a reasonable price.
Informatica is used to load data into the newly developed Rosetta environment, which has data
marts and databases for various applications. Informatica is also used for the non-Rosetta
environment, which has data marts used for decision support, and databases used by the VRU, the
web portal, and some other applications. Data is provided from mainframe files, flat files, Oracle
databases, and third-party applications, and is moved to a staging area and finally to the target
based on the application requirement.
Mainly involved in loading all data into the data marts and giving status reports in the morning
to the business users.
Monitored the support mailbox for different user requests and tracked the job failure
notifications sent when any job in the Informatica Monitor is in an error or failed state.
Communicated with different teams about downtime for the production servers.
Interacted with developers, DBAs, and the Application Support Team to resolve issues, and was
responsible for all production monitoring activities for top-priority cases raised by EDS, tracking
the activities by logging tickets in PIV.
Updating the tickets periodically with the progress on the issue and closing the tickets once the
issue is resolved.
Performed data manipulations using various Informatica transformations such as Joiner,
Expression, Lookup, Aggregator, Filter, Update Strategy, and Sequence Generator.
Developed the mappings for Source to Staging as per the ETL specification.
Developed tasks such as command, session, and workflow using the Workflow Manager.
Understood the business requirements and developed mappings.
Involved in Preparation of ETL Specifications and Unit Test Case Document.
Used the Workflow Manager for creating, validating, testing, and running the sequential and
concurrent batches and sessions, and scheduling them to run at specified times with the required
frequency.