
FIRSTNAME LAST NAME
ETL DATASTAGE DEVELOPER
PHONE NO:
E-MAIL ID:

PROFESSIONAL SUMMARY:
 Around 7 years of experience in Data Warehousing in the areas of ETL Design and Development.
Proficient in stages of Software Development life-cycle including System Analysis, Design,
Development, Implementation, Production Support and Maintenance.
 Extensive experience in Extraction, Transformation and Loading (ETL) of data from various sources
using IBM WebSphere DataStage. Worked on DataStage client tools like DataStage Designer,
DataStage Director and DataStage Administrator.
 Strong understanding of the principles of Data Warehouse using fact tables, dimension tables and
star/snowflake schema modeling.
 Excellent knowledge in Extraction, Cleansing and Modification of data from/to various Data
Sources like Flat Files, complex files, Sequential files, Comma Delimited files (.csv), XML and
Databases like Oracle, MS SQL Server, DB2, Teradata etc.
 Developed efficient mappings for data extraction/transformation/loading (ETL) from different
sources to a target Data Warehouse.
 Familiar with highly scalable parallel-processing infrastructure using parallel jobs.
 Experienced in scheduling job sequences using DataStage Director and UNIX scripts.
 Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for
Data Warehouses.
 Migrated parallel jobs from DataStage 8.5 to DataStage 11.5 as per requirements.
 Involved in DataStage migration and backup from version 8.5 to 11.5 and modified all
UNIX/scheduling scripts for DataStage 11.5.
 Used Enterprise Edition/Parallel Extender stages like Datasets, Sort, Remove Duplicate, Join,
Lookup, Merge, Change Capture, Funnel, Row Generator and many other stages in accomplishing
the ETL Coding.
 Extracted data from mainframe applications, which use hierarchical file systems, using the
Complex Flat File stage.
 Worked with and extracted data from various database sources: Oracle 10g/9i/8i, DB2, MS SQL
Server, Teradata, and sequential files.
 Developed DataStage jobs that deal with binary data as well as fixed-length data from mainframe
systems.
 Worked with the Federation Server stage to extract data from and load data into mainframe
applications.
 Experience in troubleshooting of jobs and addressing production issues like performance tuning
and enhancement.
 Expertise in Unit testing, Integration testing, back-end testing and maintenance.
 Expert in performance analysis of jobs and queries, and in performance enhancement and tuning of
jobs.
 Assisted in development efforts for Data Marts and reporting.
 Experience with newer IBM WebSphere DataStage features such as XML, JSON, MQ, Web services,
and REST APIs; expert in creating XSD and WSDL.
 Experience in IBM InfoSphere DataStage administration tasks.
 Experience in IBM InfoSphere DataStage version upgrades.
 Experience working with the Azure SQL Database Import and Export Service.
 Experience with Microsoft Azure data storage, Azure Data Factory, and Data Lake.
 Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP
and dimensional modeling for OLAP.
 Extensive experience in writing Triggers, Packages, Stored procedures and Functions using PL/SQL
and T-SQL.
 Demonstrated experience in writing complex T-SQL queries as per requirements.
 Exposure to Microsoft Azure Machine Learning.
 Experienced in using various stages like Join, Merge, Lookup, Remove Duplicates, Sort, Filter,
Funnel, Dataset, Change Data Capture, Switch, Modify, and Aggregator in DataStage designing.
 Worked with SQL, SQL*PLUS, Oracle PL/SQL Stored Procedures, Triggers, SQL queries and loading
data into Data Warehouse/Data Marts.
 Implemented healthcare applications using FACETS.
 Excellent experience in developing and understanding HIPAA EDI transaction sets like 837P,
837I, 837D, 270/271, 276/277 and ANSI X12 EDI standards.
 Experience in EDI formats like X12, EDIFACT, and XML.

TECHNICAL SKILLS:

Programming: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, HTML, DHTML, C

Database: MS SQL Server, Oracle, Teradata, DB2.

ETL Tools: IBM InfoSphere DataStage 8.5/9.1/11.5, IBM WebSphere
DataStage 8.0.1 (Designer, Director, Administrator)

Browsers: Firefox, Google Chrome, Safari.

Operating Systems: Windows, UNIX, Linux.

Office Tools: MS PowerPoint, MS Word, MS Excel.

Data Modeling: Star Schema Modeling, Snowflake Modeling, Facts and
Dimensions, Physical and Logical Data Modeling using Erwin.

PROFESSIONAL EXPERIENCE:

CLIENT: PRESBYTERIAN HEALTHCARE SERVICES, NEW MEXICO


DURATION: AUGUST 2018 - PRESENT
ROLE: ETL DATASTAGE DEVELOPER
RESPONSIBILITIES:
 Worked on various projects for the "Centennial Care" project, the State's new
Medicaid program.
 Analyzed the business requirements by dividing them into subject areas and understood the
data flow within the organization.
 Conducted one-on-one sessions with business users to gather Data Warehouse
requirements.
 Applied extensive experience in system analysis, design, development, and implementation
of relational database and Data Warehousing systems using IBM DataStage (InfoSphere
Information Server, WebSphere, Ascential DataStage).
 Developed and analyzed data marts to prepare forecasts and identify trends within the Data
Warehouse using SQL.
 Created conceptual & logical models, logical entities and defined their attributes, and
relationships between the various data objects.
 Used different parallel job stages like Sequential File, XML, Dataset, Join, Merge,
Lookup, Filter, Column Generator, Row Generator, Transformer, Change Data Capture, Modify,
Aggregator, Remove Duplicates, source and target stages, and Oracle and ODBC stages.
 Used Information Analyzer to profile data, check data content and structure, and analyze
source system data according to business needs.
 Developed queries or stored procedures using T-SQL to be used by reports to retrieve
information from relational database and data warehouse.
 Involved in writing SQL queries and PL/SQL programs; created new packages and
procedures, and modified and tuned existing procedures and queries using TOAD.
 Implemented overall best practices, tips and techniques in the design, development, testing
and deployment of DataStage jobs to the target Oracle environments.
 Executed various EDI 278 batch jobs, including FTP jobs and request/response jobs, producing
278 responses for certain trading partners.
 Prepared documentation for addressing the referential integrity relations in between the
tables at ETL level. Used DataStage manager to import table definitions from various
databases, import and export the DataStage jobs between development, testing and
production environments.
 Provided staging solutions for data validation and cleansing with DataStage ETL jobs.
 Responsible for all pre-ETL tasks upon which the Data Warehouse depends, including
managing and collecting data from various existing data sources.
 Updated existing models to integrate new functionality into an existing application.
 Created naming convention files and coordinated with DBAs to apply the data model
changes.
 Used forward engineering to create a Physical Data Model with DDL that best suits the
requirements from the Logical Data Model.
 Involved in setting up events for proactive monitoring. Involved in Analyzing and Optimizing
Query Performance.
 Tuned DataStage jobs for better performance by creating DataStage Hashed files for staging
the data and lookups. Used DataStage Director for running the Jobs.
 Participated in walkthroughs and provided approval of Test Plans and Test Cases. Participated
in defect reviews.
 Reported and maintained defects in JIRA (defect tracking system).
 Applied a good understanding of optimizer behavior and query execution plans.
 Worked on MS Azure to extract some of the source data.
 Responsible for defining the naming standards for Data Warehouse.
 Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS,
DW).
 Understood the state's electronic Medicaid eligibility verification system and the Medicaid and
Medicare welfare system intermediary, along with their roles in claim processing.
 Responsible for Medicaid Claims Resolution/Reimbursement for state healthcare plan using
MMIS.

ENVIRONMENT: IBM InfoSphere DataStage (Parallel Extender), SQL, Shell Scripts, Azure, IBM Rational
(ClearQuest & ClearCase), MS SQL Server, Oracle, MS Excel, Requisite Pro, PL/SQL, T-SQL, Windows.

CLIENT: NEW CENTURY HEALTH- BREA, CA


DURATION: MAY 2017 – JULY 2018
ROLE: ETL DATAWAREHOUSE DEVELOPER
RESPONSIBILITIES:
 Worked on requirements gathering, analysis, testing, metrics, and project coordination.
Developed documents like Source-to-Target mappings for developing the ETL jobs.
 Populated Data Marts at different levels of granularity for Vendors using DataStage, SQL
scripts and stored procedures.
 Imported the required Metadata from heterogeneous sources at the project level.
 Involved in the deployment of DataStage jobs from Development to Production
environment.
 Developed parallel jobs using parallel stages like Merge, Join, Lookup, Transformer (parallel),
Oracle Enterprise, Dataset, Funnel, Change Data Capture, and Pivot.
 Migrated DataStage server jobs to parallel jobs using the IBM InfoSphere Connector Migration
Tool.
 Worked with DataStage Designer to create the table definitions for the CSV and flat files,
import the table definitions into the repository, import and export the projects, release,
and package the jobs.
 Performed debugging on these jobs using the Peek stage by outputting data to the job log or to a
stage.
 Deployed new applications into the production environment. Supported existing and new
DataStage applications.
 Wrote complex SQL queries to enable extensive testing of ETL process.
 Generated server-side PL/SQL scripts for data manipulation and validation and materialized
views for remote instances.
 Developed job sequences to execute a set of jobs with restartability and checkpoints, and
implemented proper failure actions.
 Utilized IBM DataStage to design ongoing jobs that handle large volumes of
data.
 Created XSDs for file parsing and loading into the destination.
 Participated in walkthroughs and provided approval of Test Plans and Test Cases. Participated
in defect reviews.
 Created Control-M UNIX jobs to schedule jobs to run at a given time and upon completion of
dependent jobs.
 Created and executed SQL queries to fetch data and compare expected results with those
obtained.
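One common way to automate such expected-vs-actual comparisons is a small UNIX script; the sketch below assumes both result sets were spooled to plain-text files (the file names and contents are purely illustrative):

```shell
#!/bin/sh
# Sketch: compare an expected query result set against the rows
# actually fetched from the database, both spooled to flat files.
# Assumes one row per line; file names here are hypothetical.

compare_results() {
    expected="$1"; actual="$2"
    # Sort both sides so row-order differences are not flagged as
    # mismatches, then compare byte-for-byte.
    sort "$expected" > "${expected}.sorted"
    sort "$actual"   > "${actual}.sorted"
    if cmp -s "${expected}.sorted" "${actual}.sorted"; then
        echo "PASS"
    else
        echo "FAIL"
    fi
    rm -f "${expected}.sorted" "${actual}.sorted"
}
```

Sorting before comparing is a deliberate choice: most SQL extracts carry no guaranteed row order unless an ORDER BY is used, so a raw diff would report spurious failures.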
 Developed queries or stored procedures using T-SQL to be used by reports to retrieve
information from relational database and Data Warehouse.
 Worked on database connections, SQL joins, views, aggregate conditions, parsing of objects
and hierarchies.
 Extracted the data from the Oracle database transforming based on business requirements
and loading into downstream Mainframe files/Oracle database for various services to run
reports.
 Worked on creating Azure Data Factory for moving and transforming the data.
 Created and documented workflow for populating data from Staging database to Data
Warehouse.
 Created EDI testing processes, documentation, and performance metrics. Created technical
specifications for EDI. Scheduled meetings with technical personnel to determine
technical parameters for EDI and other related processes, including communication,
security, and privacy.
 Responsible for creating the test plan and designing test cases for the EDI 834 member
enrollment file loading. Created reports that included a general overview, open bugs, new
bugs, and enhancement requests. Entered new bugs and maintained their status.
 Worked onshore as a coordinator for the offshore team in an ETL, DataStage, UNIX, and
Oracle environment.

ENVIRONMENT: DataStage 11.5/9.1/8.7, Oracle 12c/11g, Toad, MS SQL Server, PL/SQL, UNIX, Control-M,
Azure, SoapUI, Oracle SQL Developer, JIRA.

CLIENT: OPTIV INC- DENVER, CO


DURATION: FEBRUARY 2015 – APRIL 2017
ROLE: DATAWAREHOUSE/DATASTAGE DEVELOPER
RESPONSIBILITIES:
 Extensively involved in the data migration team, building reusable DataStage job templates,
common parameter sets, common DataStage job containers, SQL extract procedures, and common
reusable shell scripts.
 Worked with business analysts to identify and develop business requirements, transformed them
into technical requirements, and was responsible for deliverables.
 Provide the staging solutions for Data validation with PL/SQL and DataStage ETL jobs.
 Extensively worked with DataStage 11.5 Designer and Director to load data from source extract files
to the warehouse.
 Designed and developed DataStage jobs for loading staging data from different sources like Teradata,
Oracle, and SQL Server into the Data Warehouse, applying business rules covering data loads, data
cleansing, and data massaging.
 Used the DataStage Director and its run-time engine to schedule running the solution, testing and
debugging its components, and monitoring the resulting executable versions.
 Created Parameters and Parameter sets where necessary.
 Scheduled server jobs using DataStage Director, which are controlled by the DataStage engine, and
monitored performance statistics of each stage.
 Created a shell script to run DataStage jobs from UNIX, and then scheduled this script
through a scheduling tool.
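A minimal sketch of such a wrapper script, assuming the standard `dsjob` command-line interface shipped with the DataStage engine; the project and job names here are hypothetical, not taken from the resume:

```shell
#!/bin/sh
# Sketch of a UNIX wrapper around the dsjob CLI. PROJECT and JOB are
# illustrative placeholders; real names come from the DataStage repository.
PROJECT="DW_PROJECT"
JOB="LoadCustomerDim"

# Build the dsjob invocation as a string so it can be logged before it
# runs; -run starts the job and -jobstatus waits for completion,
# reflecting the job's finishing status in dsjob's exit code, which a
# scheduling tool (cron, Control-M, etc.) can act on.
build_dsjob_cmd() {
    echo "dsjob -run -jobstatus $1 $2"
}

CMD=$(build_dsjob_cmd "$PROJECT" "$JOB")
echo "Executing: $CMD"
# eval "$CMD"   # uncomment on a host where DataStage is installed
```

The actual invocation is left commented out because `dsjob` only exists on a DataStage engine tier; the wrapper pattern itself is what a scheduler would call.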
 Experience in troubleshooting of jobs and addressing production issues like performance tuning and
enhancement.
 Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Packages,
Records and Collections.
 Worked with the application developer and provided necessary SQL Scripts using T-SQL.
 Created mappings and workflows to extract and load data from relational databases, flat-file
sources, and legacy systems using Azure.
 Used DataStage Designer to develop processes for extracting, cleansing, transforming,
integrating, and loading data into Data Marts.
 Created views for hiding actual tables and to eliminate the complexity of the large queries.
 Created various indexes on tables to improve the performance by eliminating the full table scans.
 Extensively involved in Unit testing, Integration testing, back-end testing and maintenance.
 Worked with experienced team members to conduct root cause analysis of issues, review new and
existing code and performed unit testing.

ENVIRONMENT: IBM Information Server DataStage 11.5, DB2, Oracle, UNIX, Windows, PL/SQL, T-SQL,
MS Visio.

CLIENT: ICICI BANK- INDIA


DURATION: JANUARY 2011 – DECEMBER 2014
ROLE: DATASTAGE DEVELOPER
RESPONSIBILITIES:
 Studied the business requirements and prepared the impact analysis document.
 Prepared the technical specification document; upon review, developed the
solution using DataStage jobs and sequencers.
 Used sequential file stage as the source for most of the source systems.
 Developed a file-check process that checks the format, volume, and date of each file and
decides whether the correct file was sent by the source and whether the correct file is being
loaded into the database.
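A file-check process of this kind can be sketched as a short UNIX script; the file-naming pattern and minimum row count below are hypothetical assumptions, not details from the project:

```shell
#!/bin/sh
# Illustrative file-check sketch: validate an inbound file's name
# format, embedded business date, and record volume before loading.
# The SRC_YYYYMMDD.dat pattern and thresholds are assumptions.

check_file() {
    f="$1"; min_rows="$2"
    # 1. Format: the file name must look like SRC_YYYYMMDD.dat
    case "$(basename "$f")" in
        SRC_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].dat) ;;
        *) echo "REJECT: bad file name"; return 1 ;;
    esac
    # 2. Date: the embedded date must match today's business date
    fdate=$(basename "$f" | cut -c5-12)
    [ "$fdate" = "$(date +%Y%m%d)" ] || { echo "REJECT: wrong date"; return 1; }
    # 3. Volume: the record count must meet the expected minimum
    rows=$(wc -l < "$f")
    [ "$rows" -ge "$min_rows" ] || { echo "REJECT: too few rows"; return 1; }
    echo "ACCEPT"
}
```

In practice such a script would gate the DataStage load: only files that print ACCEPT are handed to the ETL job, and rejects are routed to an error directory for investigation.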
 Used IBM InfoSphere DataStage to extract and load data into the Data Warehouse.
 Used Aggregator, Lookup, Join, Merge, Dataset, Transformer, Sequencer, Sequential File, DB2
Bulk Load, Hashed File, and Surrogate Key Generator stages.
 Created DDL statements for new tables, changes to table structure, index changes, and
creation of triggers and stored procedures.
 Prepared unit test cases and test plans.
 Executed the test cases and captured the results.
 Supported SIT and UAT testing.
 Worked on packaging the code using the TortoiseSVN version control tool and worked with the
respective teams to deploy the code.
 Supported the system post-production and worked in coordination with the production
support teams to resolve any issues.

ENVIRONMENT: DataStage (Designer, Director, Manager, Administrator) Enterprise Edition, MS
SQL Server, SQL, PL/SQL, IBM DB2, MS Visio, Windows, MS Office, Agile, MS Access, UML, JIRA,
Oracle.

EDUCATIONAL DETAILS:
 Master’s in Computer Science from Fairleigh Dickinson University, Teaneck, New Jersey.
 Bachelor of Technology in Computer Science from JNTU, India.
