Senior ETL Developer

PROFESSIONAL SUMMARY:

- Over 7 years of IT experience and technical proficiency in building Data Warehouses, Data Marts, Data Integration, Operational Data Stores, and ETL processes for clients in the Financial (Equities, Futures, Options, Commodities, Spot, Swaps, Bonds, Credit Risk, Market Risk, Operational Risk) and Healthcare (Providers, Customers, Organizations, Plans, Claims, and Extracts) domains.
- 5+ years of strong experience in large-scale Data Warehouse implementations using Informatica PowerCenter 8.x/7.x/6.x, Oracle, DB2, and SQL Server on UNIX and Windows platforms.
- Strong knowledge of OLAP systems, Kimball and Inmon methodologies and models, and dimensional modeling using Star and Snowflake schemas.
- Extensive experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Informatica Administration Console).
- Expertise in implementing complex business rules by creating robust mappings, mapplets, sessions, and workflows using Informatica PowerCenter.
- Experience in performance tuning of Informatica mappings and sessions to improve the performance of large-volume projects.
- Experience in migration, configuration, and administration of Informatica PowerCenter.
- Experience in integrating various data sources such as Oracle, DB2, SQL Server, flat files, mainframes, and XML files into the Data Warehouse; also experienced in data cleansing and data analysis.
- Extensively used SQL and PL/SQL to write stored procedures, functions, packages, cursors, triggers, views, and indexes in a distributed environment.
- Excellent expertise with different data load strategies and scenarios such as historical dimensions, surrogate keys, and summary facts.
- Worked extensively in all stages of the SDLC, from requirements gathering through testing, implementation, and support.
- Experience in preparing documentation such as high-level design, system requirement, and technical specification documents.

 

- Strong experience in writing UNIX Shell scripts and SQL scripts for development, automation of the ETL process, error handling, and auditing purposes.
- Experience in using UC4, Autosys, and Control-M scheduling tools to organize and schedule jobs.
- Experience in using IBM Clear Quest to track defects and document test cases.
- Worked with cross-functional teams such as QA, DBA, and Environment teams to deploy code from development to the QA and Production servers.
- Good knowledge of generating various complex reports using OBIEE, MicroStrategy, and Business Objects.
- Good knowledge of TIBCO Rendezvous and IBM MQ Series.
- Experience in project management, estimations, resource management activities, and deliverables tracking.
- Excellent analytical and problem-solving skills with a strong technical background and good interpersonal skills.

TECHNICAL SKILL SET:

Operating Systems: UNIX, Linux, Windows XP/2000
Databases: Oracle 11g/10g/9i, DB2 V8.x, SQL Server 2008/2005, MS Access
Database Tools: TOAD, SQL Navigator
Load Utilities: SQL Loader
ETL Tool: Informatica PowerCenter 8.x/7.x/6.x
BI Reporting Tools: Business Objects XI, OBIEE, MicroStrategy 8.x
Programming Languages: C, C++, Java, J2EE, JSP, COBOL, PL/SQL, XML, HTML
Scripting Languages: Shell Scripting, Perl Scripting
Scheduling Tools: UC4, Control-M, Autosys, Cron
Application Servers: Web Logic 10.x, Tomcat
Middleware: TIBCO Rendezvous, IBM MQ Series
Test Management Tools: IBM Clear Quest, Quality Center, JIRA

PROFESSIONAL EXPERIENCE:

Confidential, Chicago IL (Jul '09 – Present)
Data Warehouse Consultant
Project: Clearing Positions

Confidential is the world's leading and most diverse derivatives marketplace. The main objective of the project is to build a distributed environment that serves as the primary source for trade processing, position management, settlement, performance bond, asset management, and banking.

Roles & Responsibilities:

- Interacted with business owners to gather both functional and technical requirements.
- Documented the business requirements and framed the business logic for the ETL process.
- Involved in creating logical and physical data models using CA ERwin data modeler; generated the DDL scripts for the physical data model.
- Designed, developed, and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries, and Shell scripts to implement complex business rules.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, functions, and transformations to load the Oracle 10g based Data Warehouse.
- Loaded historical and intraday trades, positions, settlements, and product data into the Oracle data warehouse to enable business analysts to better understand, monitor, and analyze the liquidity-generating performance of Market Maker firms trading CME Group's products.
- Migrated historical data from DB2 to the Oracle data warehouse; transferred data from various sources like XML, flat files, and DB2 into the Oracle data warehouse.
- Developed Shell/Perl scripts to transfer files using FTP and SFTP (a sample script is sketched below).
- Extensively worked on SCD Type 2 using the Lookup transformation.
- Performed/automated many ETL-related tasks, including data cleansing; responsible for capturing and correcting error data.
- Used UC4 for job scheduling, workload automation, and generating reports.
- Used Business Objects XI R2 to programmatically generate reports and gather necessary information about report instances.
- Experience using Web Logic for hosting the servers.
- Identified bottlenecks/issues and fine-tuned them for optimal performance; implemented best practices to maintain optimal performance.
- Worked with DBAs and systems support personnel in elevating and automating successful code to production.
- Oversaw unit and system tests and assisted users with acceptance testing.
- Used Agile methodology for the SDLC and utilized scrum meetings for creative and productive work.
- Developed technical specifications and other helpful ETL documents following CME Group's standards.
- Upgraded the Informatica repository to 8.6.1.

Environment: Informatica PowerCenter 8.6.1, Oracle 11g/10g RAC, DB2 v8.x, PL/SQL, SQL, TOAD 9.x, SQL Developer, UC4, ERwin, BO XI, Web Logic 10.x, TIBCO, Shell/Perl Scripting, IBM Clear Case & Clear Quest 7.x, RHEL 5, Windows XP
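The file-transfer scripts mentioned in the responsibilities above usually follow a simple pull-and-validate pattern. The following is a hypothetical shell sketch only, not the actual project script; the host, user, directory, and file names are placeholders.

```sh
#!/bin/sh
# Hypothetical sketch of a daily SFTP pull feeding an ETL load.
# Host, user, directory, and file names are placeholders.

SRC_HOST="sftp.example.com"          # assumed remote host
SRC_DIR="/outbound/trades"           # assumed remote drop directory
LANDING_DIR="/data/landing/trades"   # assumed local landing area
FILE_NAME="trades_$(date +%Y%m%d).dat"

# Batch-mode SFTP pull; assumes key-based authentication is already configured.
sftp -b - "etl_user@${SRC_HOST}" <<EOF
cd ${SRC_DIR}
lcd ${LANDING_DIR}
get ${FILE_NAME}
bye
EOF

# Abort the downstream load if the file is missing or empty.
if [ ! -s "${LANDING_DIR}/${FILE_NAME}" ]; then
    echo "ERROR: ${FILE_NAME} missing or empty" >&2
    exit 1
fi
echo "Transfer complete: ${LANDING_DIR}/${FILE_NAME}"
```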

Confidential, Wilmington DE (Mar '08 – Jun '09)
Informatica Consultant

The objective of the project is to provide timely, accurate, and consistent sales information: the ability to measure accurate sales results and identify growth potentials. This involves developing and implementing a solution that addresses the technology, business, and organizational issues and provides a more complete picture of sales. The proposed technology solution is to implement a data warehouse that serves as the primary source for all business reporting. Data is extracted from several source systems and consolidated into an enterprise data store known as the Operational Data Store (ODS). Data is then extracted from the ODS and transformed into various data marts, such as the Spending Report (SPR) data mart, which is used to track the total spending of different groups/departments.

Roles & Responsibilities:
- Interacted with business users to gather business requirements and designed user-friendly templates to communicate any further enhancements to be implemented.
- Created a System Interface Agreement (SIA) between the source and target systems, which contains the escalation procedures in case of issues and the SLA.
- Coordinated with data modelers in designing the dimensional model.
- Extensively worked in the Credit Cards billing and payments subject area.
- Designed ETL specifications with transformation rules, using ETL best practices for good performance.
- Involved in documenting Functional Specifications and Design Specifications, and created ETL Specifications documents and updated them as and when needed.
- Developed mappings using transformations such as Source Qualifier, Expression, Filter, Router, Aggregator, Rank, Lookup, Update Strategy, and Sequence Generator.
- Designed reusable objects like mapplets and reusable transformations in Informatica.
- Played a role in the design of scalable, reusable, and low-maintenance ETL templates, ensuring maintainability of the code and efficient restartability.
- Experience in working with third-party data cleansing tools like Trillium.
- Conducted code walkthroughs and reviewed peer code and documentation.
- Provided on-call support to the production system to resolve any issues (a sample run wrapper is sketched below).

Environment: Informatica PowerCenter 8.x, SQL Server 2008/2005, PL/SQL, TOAD, Erwin, Trillium, IBM MQSeries, BO XI, Autosys, Shell Scripting, Tomcat, Sun Solaris 2.x, Windows XP
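Production workflows like the ones supported above are typically launched by the scheduler through a small shell wrapper that surfaces failures via its exit code. The sketch below is a hypothetical example built around Informatica's pmcmd command-line utility; the service, domain, folder, and workflow names are placeholders, and exact pmcmd options can differ between PowerCenter versions.

```sh
#!/bin/sh
# Hypothetical wrapper a scheduler (e.g. Autosys) could call to run a workflow
# and report success or failure through the exit code. All names are placeholders.

INFA_SERVICE="IS_PROD"         # assumed Integration Service name
INFA_DOMAIN="Domain_Prod"      # assumed domain name
INFA_FOLDER="SALES_DW"         # assumed repository folder
WORKFLOW="wf_nightly_sales_load"

# PMUSER / PMPASS are assumed to be exported by the scheduler's environment.
pmcmd startworkflow \
    -sv "${INFA_SERVICE}" -d "${INFA_DOMAIN}" \
    -u "${PMUSER}" -p "${PMPASS}" \
    -f "${INFA_FOLDER}" -wait "${WORKFLOW}"
RC=$?

if [ "${RC}" -ne 0 ]; then
    echo "Workflow ${WORKFLOW} failed with return code ${RC}" >&2
    exit "${RC}"
fi
echo "Workflow ${WORKFLOW} completed successfully"
```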

Confidential, Maryland (Mar '06 – Feb '08)
Informatica Consultant
COMSORTer Application

Comsort Inc. is a wholly owned subsidiary of Merck Inc., one of the world's largest pharmaceutical companies. Comsort creates and sends surveys to physicians to nominate specialists in their respective fields, whom Merck's sales and marketing team targets for promoting drugs and invites as speakers at physician conferences where they can promote Merck's drugs. The project was about creating an online application that Comsort could use to enter the survey data and to generate lists of physicians based on the surveys and the nominations provided.

Responsibilities:
- The COMSORTer application was Oracle based, and the existing data was stored on SQL Server and DB2; migrated the data from SQL Server and DB2 to Oracle.
- Worked closely with the Business Intelligence (BI) team and assisted them in developing reports using the Business Objects reporting tool.
- Designed and developed UNIX shell scripts for the automation of ETL jobs.
- Involved in cleansing raw data in the staging area using stored procedures in pre- and post-session routines.
- Experience in implementing Type II changes in Slowly Changing Dimension tables.
- Tested and tuned the SQL queries for better performance.
- Performed data validation in the target tables using complex SQL to make sure all the modules were integrated properly.
- Identified the bottlenecks in mapping logic and resolved performance issues.
- Conducted code reviews to make sure the business requirements were met and the coding standards were followed.
- Coordinated with the System Support team to set up the system test environment for code migration and code execution in the QA environment.

- Created a mapping document that outlines the sources mapped to the targets, and a document outlining the plan of action for the entire process.
- Created views to select data from the existing SQL Server databases, and created DTS packages to generate flat files from those views.
- Designed mappings using transformations such as Source Qualifier, Expression, Filter, Router, Lookup, Joiner, Normalizer, and XML Source Qualifier/Parser/Generator to load data from different sources like Oracle, Excel spreadsheets, and flat files.
- Transferred data from a combination of different input files, such as XML, COBOL, and flat files, to the target Oracle data warehouse.
- Designed mappings to load the Surveys, Questions, Projects, and other survey-related tables, loading the staging tables first and then the destination tables.
- Created different transformations using Informatica for loading the data into the SQL Server database.
- Created workflows with the Event Wait task to specify when the workflow should load the tables: the Event Wait task would wait for the indicator file dropped onto the Informatica server by the Cold Fusion front end, and then transfer control to the rest of the workflow to load the data (a hypothetical shell analogue is sketched after this section).
- Generated SQL*Loader scripts and Shell scripts for automated daily load processes.
- Used existing UNIX scripts and modified them to load the Oracle tables.
- Created, reviewed, optimized, and executed complex SQL queries to validate the transformation rules used in source-to-target mappings/source views and to verify data in the target tables.
- Developed triggers and stored procedures for data verification and processing.
- Extensively worked on database performance tuning techniques such as identifying bottlenecks, optimizing SQL, reducing unnecessary caches, and modifying complex join statements; worked closely with DBAs during the performance testing phase for the database.
- Involved in the smooth transition from Informatica 7.1 to Informatica 8.x; worked as an Informatica Administrator to migrate the mappings, sessions, workflows, and repositories into the new environment.
- Configured and administered the Informatica servers.
- Created Functional Spec and Technical Spec documentation, and documented the issues found in end-to-end testing.
- Created views and designed mappings to load test data for UAT of the application.
- Created various geographical and time dimension reports.
- Designed and developed scripts for administrative tasks such as backups, tuning, and periodically refreshing the test databases from the production databases.
- Moved the mappings and workflows from Dev to QA and from QA to Production, unit testing the process at every level; documented detailed steps for migrating the code.
- Extensively used the Debugger to test data and applied breakpoints.
- Provided production support for business users and documented problems and solutions for running the workflows; supported the application in the Production environment by monitoring the ETL process every day during the nightly loads.
- Conducted knowledge transfer (KT) of the entire process to the production support team members.

Environment: Informatica PowerCenter 8.x/7.1, Oracle 10g/9i, SQL Server 2000, DB2 UDB v8.x, PL/SQL, T-SQL, SQL*Loader, Shell Scripts, Erwin, MicroStrategy 8.x, TOAD 7.x, Control-M, UNIX AIX 4.x, Windows XP/2000
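The indicator-file handshake and SQL*Loader loads described above can be illustrated with a plain shell analogue (the real implementation used the Informatica Event Wait task rather than a script). In the hypothetical sketch below, the paths, connect string, and polling limits are placeholders.

```sh
#!/bin/sh
# Hypothetical shell analogue of the indicator-file pattern: poll for the
# trigger file dropped by the front end, then run a SQL*Loader load.
# Paths, connect string, and limits are illustrative placeholders.

INDICATOR="/data/inbound/survey_load.done"   # assumed trigger file
CTL_FILE="/etl/ctl/survey_stage.ctl"         # assumed SQL*Loader control file
MAX_TRIES=60                                 # give up after roughly an hour
SLEEP_SECS=60

tries=0
while [ ! -f "${INDICATOR}" ]; do
    tries=$((tries + 1))
    if [ "${tries}" -gt "${MAX_TRIES}" ]; then
        echo "Timed out waiting for ${INDICATOR}" >&2
        exit 1
    fi
    sleep "${SLEEP_SECS}"
done

# ORA_CONN (user/password@db) is assumed to be provided by the environment.
sqlldr "${ORA_CONN}" control="${CTL_FILE}" log=/etl/logs/survey_stage.log

# Remove the indicator so the next cycle waits for a fresh drop.
rm -f "${INDICATOR}"
```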

Confidential, Hyderabad, India (May '04 – Feb '06)
Developer
Project: Annual Maintenance Contract (AMC)

Ceeyes is a leading provider of intellectual property software cores in areas such as Layer 2/7 Networking, Wireless, and Embedded real-time software, with expertise in product design and development for embedded systems software and system integration.

Roles & Responsibilities:
- Identified functional requirements, analyzed the system, provided suggestions, designed as per the requirements, and tested the design.
- Involved in estimation and detailed scheduling of the various modules.
- Used Agile methodology in the design and development of the application.
- Coordinated with team members in analyzing the business requirements.
- Developed the conceptual design document with prototyping of the UI.
- Identified database requirements and was involved in designing the database for various modules.
- Created stored procedures, functions, and packages for applying the business rules.
- Created generic packages useful for other team members.
- Achieved performance tuning and optimization through the management of indexes, table partitioning, and optimization of the SQL scripts.

Environment: PL/SQL, SQL, Oracle 9i, J2EE (JSP, Servlets), XML, UNIX, Windows 2000

Educational Qualification: Bachelor's Degree in Computer Science & Information Technology
