Dipak Gopale
Data Engineering Lead
Summary:
Experienced as a Manager, Lead, BI Consultant, and SAS Programmer/Developer/EBI Developer
in analyzing, developing, and implementing applications across ETL and reporting tools.
Lead ETL development using tools such as SAS and Informatica.
Responsible for the extraction, transformation, and loading of data from multiple sources into the data
warehouse; worked with sources including SQL Server, Oracle, Excel, and flat files.
Expertise in SAS DI transformations: Append, Extract, Table Loader, Data Validation, SQL Join,
Surrogate Key Generator, SCD Type 2, Splitter, User Written, File Reader, and Lookup.
Extensive knowledge of SAS dataset management: merging, concatenating, interleaving, and moving
datasets.
Experience with procedures such as PROC MEANS, PROC FREQ, PROC TABULATE, PROC IMPORT,
PROC EXPORT, PROC DATASETS, PROC FORMAT, PROC SORT, and PROC REPORT.
Experience handling large data volumes; coded for scalability using pass-through queries, hash tables,
parallel processing, and indexes.
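For illustration, a minimal sketch of the two techniques named above; all library, table, column, and
credential names here are hypothetical placeholders, not taken from any client project:

```sas
/* Hypothetical sketch: push aggregation down to Oracle via SQL pass-through.
   &user, &pass, orapath, and txn_table are placeholders. */
proc sql;
  connect to oracle (user=&user password=&pass path=orapath);
  create table work.txn_summary as
    select * from connection to oracle (
      select cust_id, sum(amount) as total_amt
      from txn_table
      group by cust_id
    );
  disconnect from oracle;
quit;

/* Hash-object lookup: enrich the summary with customer attributes in one pass */
data work.enriched;
  length segment $20;
  if _n_ = 1 then do;
    declare hash h(dataset: 'work.customers');
    h.defineKey('cust_id');
    h.defineData('segment');
    h.defineDone();
    call missing(segment);
  end;
  set work.txn_summary;
  if h.find() = 0 then output;   /* keep only rows with a matching customer */
run;
```

The pass-through step avoids pulling detail rows into SAS; the hash object replaces a sort/merge for
the lookup.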
Developed SAS macros for data cleaning, data mining, reporting, and routine processing.
Delivered POCs/demos to banking and airline clients; prepared POCs on client data using SAS Visual
Analytics, SAS Data Integration Studio, and SAS Enterprise Guide.
Good knowledge of deploying and scheduling jobs in SAS Management Console and running them via
Process Flow Manager.
Good experience in end-to-end Power BI application implementation: requirement analysis, architecture,
data modeling, design, development, data loads, testing, deployment, and support of Power BI
applications.
Experience scheduling Power BI dashboard refreshes at the required frequency.
Developed shell scripts to execute jobs and job flows in the Zena and SKED schedulers.
Used change management (check-in/check-out) to track changes and created project repositories for
users. Quick to learn and adapt to new technologies; strong analytical and problem-solving skills.
Experience with Informatica PowerCenter 10.4.
Involved in proposal activities.
Skills:
SAS Programming:
Base SAS, SAS Macros, SAS/SQL, SAS procedures, SAS ODS, ETL, BI
Tools:
SAS Data Integration Studio 4.9, Informatica PowerCenter 10.4, SAS Enterprise Guide 7.1, SAS Information
Delivery Portal 4.3, SAS Information Map Studio 4.3, SAS Management Console 9.4, SAS Web Report Studio
4.3, SAS BI Dashboard 4.3, OLAP Cube, SAS Visual Analytics, Power BI, Hyperion, SAS Customer Intelligence
Programming Languages:
PowerShell, C, C++, Java, JavaScript, PHP, HTML, Unix shell
Database:
Oracle, SQL Server 2008, Teradata
Basics:
SAS DataFlux, Hadoop (HDFS, Hive, MapReduce, Pig, Sqoop, HBase, Flume, Oozie), SSIS, SSRS, Pentaho,
MongoDB (NoSQL), Python, Tableau
LANCESOFT
Education:
MCA (Comp. App.), Pune University 2011-2014
BSc (Comp. Sci.), Pune University 2008-2011
Experience
Project:
PROJECT 1:
Client: Khan Bank, RHB Bank, KAF Digital Bank Jan 2022 – present
Manager, EY, Malaysia.
Tools & Environment:
Digital Banking
EA Metrics
Project Description
Working on various banking projects.
Role & Responsibility:
Engaged in a technical PM role to manage the IBM DQ tool setup; responsible for coordinating
activities between RHB IT and the vendor (Affinitidata).
Assisted in reviewing DQ SAS scripts and translating them into SQL scripts.
Built enterprise architecture IT measurement metrics for Khan Bank; analyzed the impact of duplicate
records on the AML system.
Involved in various RFP/RFI proposal/pursuits activities.
Built the end-to-end solution architecture and integration design for the digital bank, working closely
with the Temenos technical team, the system implementation team, and the client to implement it.
PROJECT 2:
IFRS 17 Takaful Columnar Reporting
Client: AIA Mar 2021 – Dec 2021
Manager, EY, Malaysia
Tools & Environment:
Informatica PowerCenter 10.4
Oracle
Project Description
This project implements the end-to-end local MFRS 17 Takaful columnar reporting, including the
disclosure report, trial balance, and posting of SAP GL files.
Role & Responsibility:
Led the DataMart workstream to implement the MFRS 17 Takaful columnar reporting DataMart,
covering design, build, test, deployment, and support.
Worked closely with accounting, actuarial, the client, and the internal DataMart team.
Validated columnar data requirements, defined business logic and the DataMart design, developed the
high-level solution architecture, designed the FSD and TSD, built the Takaful DataMart using Informatica
PowerCenter, developed test scenarios, and executed test cases.
Involved in test management activities; worked in JIRA and handled defect debrief calls with the client.
PROJECT 3:
IFRS 17 SAS Subledger Solution
Client: Tokio Marine Insurance Jul 2020 – Feb 2021
PROJECT 4:
SAP-NXP ATKL
Client: NXP Semiconductor Dec 2018 – Mar 2020
BI Consultant, Accion Labs SDN BHD
Tools & Environment:
Power BI
Tableau
Hyperion
PowerShell Script
Power Update
Pentaho
Oracle SQL Developer
Teradata SQL Assistant
Power Query
Microsoft flow
Microsoft Azure
Project Description
This project involved migrating from legacy SAP systems to a new SAP system to achieve one standard
system across all sites.
Created reports for the planning and manufacturing teams.
The reports help the teams forecast the demand and due dates for lots.
Role & Responsibility:
Worked with the EBI and Manufacturing global team (ATKL, ATBK, ATKH, ATTJ).
Responsible for requirement gathering, POC, designing, developing, and delivering Microsoft Power BI
reports.
Developed line and stacked column charts; used drill-down/up and drill-through, and summarized data
to show machine status and total working duration as a percentage.
Refreshed Power BI datasets using PowerShell scripts (API), DAX, and the Power BI cloud scheduler
(SaaS); shared reports with users via the Power BI app and SharePoint.
Developed flows for email alerts in Microsoft Flow.
Built jobs in Hyperion and scheduled them in Microsoft Task Scheduler.
Developed reports in Power Query to create ODS (operational data store) files.
Scheduled and transferred the Power Query reports using Power Update.
PROJECT 5:
COAP (Collection Optimization Analysis Project)
Client: RHB Bank July 2016 to Nov 2018
SAS ETL-Developer, Tentacle SDN BHD
Tools & Environment:
SAS DI Studio
SAS EG
SAS Management Console
Visual Analytics
SAS Customer Intelligence
Process Flow Manager
Zena Scheduler
WinSCP
Toad
Fusion Invest
VMware vSphere Client
Project Description:
This project identifies delinquent and non-delinquent accounts for RHB Bank across personal loan, credit
card, mortgage, ASB, and AF customers.
Role & Responsibility:
As an ETL developer on the COAP team, responsible for ETL: gathering and analyzing data, performing
data mapping, and confirming data requirements with stakeholders and business users.
Processed data using business logic and generated daily/monthly data marts in SAS Data Integration Studio.
Developed reports in SAS Visual Analytics to analyze customer score, rank, channel, and account status.
Generated batch script programs in the SAS Customer Intelligence environment to optimize the data mart.
Developed Enterprise Guide projects to test data row counts.
Involved in LSF activities in SAS Management Console; triggered flows using Process Flow Manager, and
created shell scripts and batch flows executed in the Zena scheduler.
Created table mappings and structures using Toad; inserted records, created views, and triggered
procedures and functions.
Extracted Oracle tables as required by business event, portfolio, and schema.
Extracted .csv files from the Oracle tables for the required data date.
Applied templates to the .csv files to create XML files, and uploaded the XML to the UAT server using the
VMware vSphere Client.
Tested data in the Fusion Invest system using the CSV and XML files.
PROJECT 6:
SWIFT (System wide information fast tracking) Reporting
Client: Malaysian Airlines July 2015 to Apr 2016
SAS Developer, TCS, Malaysia
Tools & Environment:
SAS DI Studio
SAS Management Console
SAS EG
SAS Visual Analytics
Process flow manager
FileZilla
Windows 7
Project Description
This project calculates the revenue of each flight for MAS, analyzes the daily NIAT (net income after tax),
and sends detail and summary data to users.
Role & Responsibility:
As a SAS developer I was responsible for ETL; source data came from DB2, .txt files, and .csv files.
Worked with the SWIFT team to understand business logic and mappings and develop automated jobs in SAS DI.
Generated the detail file, summary file, and daily NIAT report, and sent reports to users via email.
Generated graphs and reports using PROC GPLOT and PROC REPORT.
Created libraries, registered tables, and connected to the Oracle database using SAS Management
Console; used the publishing framework, Schedule Manager, User Manager, and Server Manager.
Understood the business requirements of external source systems; performed data profiling, cleaning,
standardization, and remediation; and created reports/dashboards in SAS VA.
Scheduled and deployed jobs using SMC and executed flows using LSF.
PROJECT 7:
Branch Sales Reporting
Client: Hong Leong Bank Mar 2015 to July 2015
SAS DI/BI Developer, Thakral One SDN BHD
Tools & Environment:
SAS DI Studio
SAS Management Console
SAS Information Map
SAS Web Report Studio
Windows 7
Role & Responsibility:
As a SAS DI/BI consultant I was responsible for loading data from source to target tables; used correlated
subqueries and different types of joins (implicit, explicit) in SQL Join transformations.
Used pre-code and post-code to retrieve a particular month's data from the source, and applied
expressions in the joined tables.
Created information maps in SAS Information Map Studio, using filters, prompts, pre-filters, and cascaded
prompts, and creating new data items and changing expressions.
In Web Report Studio, used objects such as crosstabs, bar charts, list tables, text, headers, footers, and
titles; within reports, used group breaks, totals, subtotals, report linking, filters, and conditional highlighting.
Created a volatile table for credit card data using an EDW script and retrieved current-month and
previous three months' records.
PROJECT 8:
Patient Account and Diagnosis Data loading
Client: Pfizer Feb 2014 to Mar 2015
SAS consultant, Mphasis
Tools & Environment:
SAS DI Studio
SAS Management Console
SAS Information Map Studio
SAS Web Report Studio
SAS Information Delivery Portal
SAS BI Dashboard
Linux, Windows 7
Oracle 10g, MS SQL Server
Role & Responsibility:
As a SAS consultant I was responsible for loading data for patient, diagnosis, geography, patient age,
gender, claim status, physician, and drug dimensions, and the Rx, TRx, and Dx fact tables.
Developed DI jobs to load data from multiple sources (Oracle, MS SQL Server, and flat files) into the
data warehouse.
Created exception tables using the Data Validation and Splitter transformations.
Developed jobs with job parameters; ran jobs iteratively using the Loop transformation.
Applied performance techniques to improve job efficiency: indexes, increased sort size, multi-threading,
loop transformations, and removing unnecessary columns from processing during mapping.
Provided feedback and necessary design changes to the lead architect.
Created cubes using SAS OLAP Cube Studio.
Conducted knowledge transfer sessions with peers and the support team.
PROJECT 9:
Backend Automated reporting System
Client: Global Healthcare Alliance Jan 2013 to Jan 2014
SAS Programmer, Mphasis
Tools & Environment:
SAS Base
SAS Macros, SAS ODS, SAS SQL, SAS Access
SAS Information Map Studio
SAS Web Report Studio
SAS Information Delivery Portal
Unix, Oracle 9i
Role & Responsibility:
As a SAS programmer I was responsible for developing reports in PDF format using SAS. Reports
included monthly adjustments, claims, and physician/patient reports, scheduled weekly and monthly.
Used SAS macros extensively, along with SAS ODS PDF.
Connected to the Oracle database using the SQL pass-through facility.
Used hash tables for lookups and parallel processing via SAS/CONNECT.
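The macro-driven ODS PDF reporting pattern can be sketched roughly as follows; the macro, dataset,
variable, and path names are all hypothetical, not from the actual project:

```sas
/* Hypothetical sketch: a parameterized macro producing a monthly PDF report.
   work.claims and its columns are placeholder names. */
%macro monthly_report(ds=, outfile=);
  ods pdf file="&outfile" style=journal;
  title "Monthly Claims Summary";
  proc report data=&ds nowd;
    columns physician claim_status n_claims;
    define physician    / group 'Physician';
    define claim_status / group 'Status';
    define n_claims     / analysis sum 'Claims';
  run;
  ods pdf close;
%mend monthly_report;

/* Date-stamped output file name built with %sysfunc */
%monthly_report(ds=work.claims,
                outfile=/reports/claims_%sysfunc(date(), yymmn6.).pdf)
```

Parameterizing the dataset and output path this way lets one macro serve the weekly and monthly
schedules from a batch script.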
Created summary and detail logs for the process, and created scripts for batch execution.
Also created ad-hoc reports using Web Report Studio; used Information Map Studio to create maps, and
used pre-filters, sections, groups, and cascaded prompts in the reports.
Developed reports in SAS Web Report Studio, including summary, detail, drill-down, and drill-up reports;
used report linking, multiple sections, filters, prompts, and conditional highlighting.
Used advanced reporting concepts such as report bursting via publishing framework channels.
Managed code execution in the Unix environment, reviewing logs and messages.