Professional Summary:
6+ years of IT experience spanning data warehousing, system analysis, design, development, testing, database design, and SQL.
Experienced in ETL/BI testing.
Proficient in writing Test Plans, Test Scenarios, and Test Cases from requirements, design specifications, and process flow diagrams.
Experience in identifying and resolving the bottlenecks in Source, Target, Transformations, Mappings and Sessions
for better performance.
Skilled in full life cycle development for building a Data Warehouse.
Excellent problem-solving skills with a strong technical background and good interpersonal skills. Quick learner and excellent team player with the ability to meet tight deadlines and work under pressure.
Experienced in performing quality testing to determine the quality of ETL processes.
Strong skills in data requirement analysis, data mapping, and debugging for ETL processes.
Experience in testing and writing SQL and PL/SQL statements.
Experience with data analysis and SQL querying in data integration and data warehousing organizations.
Experience in creating UNIX scripts for file transfer and file manipulation.
Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading.
Experience in optimizing query and session performance and fine-tuning mappings.
Extensively worked on databases including Oracle, Teradata, SQL Server, Sybase and DB2.
Extensively involved in an onsite-offshore model that required daily meetings, as the team followed the Agile methodology.
Extensively worked in HP Quality Center and JIRA to perform testing and analysis.
Experienced in end-to-end testing of ETL, Reference Data, Data Warehousing, and Business Intelligence solutions for large enterprise data.
Working experience in using Informatica Workflow Manager to create and schedule workflows and Worklets.
Technical Skills:
Programming: UNIX Shell Scripting (Korn Shell, C Shell, Bourne Shell, Bash), SQL, SQL*Plus, PL/SQL, TOAD, C++
Web Technologies: HTML and DHTML, XML, XSD, XSLT
Environment: UNIX, MVS, IBM AIX, Hyperion
Databases: Netezza
Worked with technical designers and architects to understand the requirements for a test environment setup.
Tested the different sources such as flat files, legacy flat files, SQL Server, and Oracle that load into the Teradata data warehouse.
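For illustration, a minimal sketch of the kind of source-to-target count reconciliation used when loading such sources into Teradata; all table names here are hypothetical:

    -- Compare row counts between a source extract and the Teradata target
    -- after a load (stg_customer_src and dw_customer_tgt are invented names).
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM stg_customer_src
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM dw_customer_tgt;
    -- The two counts should match, allowing for rows rejected by design.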
Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
Tested whether the reports developed in Cognos conformed to company standards.
Wrote and executed test cases and documented defects in Quality Center.
Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.
Worked in an Agile environment, attending daily stand-up, sprint planning, backlog grooming/refinement, and retrospective meetings.
Ensured data integrity and verified all data modifications and calculations during database testing.
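As a sketch of this kind of calculation check (table and column names are invented for illustration):

    -- Flag target rows whose derived amount disagrees with the source calculation.
    SELECT src.order_id
    FROM   src_orders src
    JOIN   dw_orders  tgt ON tgt.order_id = src.order_id
    WHERE  tgt.net_amount <> src.gross_amount - src.discount_amount;
    -- Zero rows returned means the calculation survived the load intact.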
Involved in web services testing using SOAP UI, REST API, validated request and response XML.
Tested the different feeds coming from Customer Data Integration and Test Data.
Checked the status of ETL jobs in UNIX and tailed the log files while loads were in progress.
Performed integration testing of Hadoop/Big Data packages for ingestion, transformation, and loading of massive structured and unstructured data into the benchmark cube.
Involved in automation script maintenance activities.
Responsible for testing and reviewing ETL mapping and transformation specifications based on business requirements from the various business teams.
Created the test plan, test strategy, critical scenarios, test scripts, and testing schedule.
Performed testing for reports and prepared the testing documentation.
Worked on Cognos reports and scheduled them to run daily and monthly.
Involved in error checking and testing of the ETL procedures and programs using the Informatica session logs.
Ran ETL jobs in Informatica and tested the migrated data from the source to the target system, raising defects in QC for any data discrepancies.
In-depth technical knowledge and understanding of Data Warehousing, Data Validation, and SQL.
Involved in writing SQL queries and database checkpoints to verify data quality and calculations, and in related reviews.
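A minimal example of such a database checkpoint, assuming a hypothetical target table and columns:

    -- Checkpoint: no NULL business keys and no out-of-range dates in the target.
    SELECT COUNT(*) AS bad_rows
    FROM   dw_transactions
    WHERE  customer_key IS NULL
       OR  txn_date NOT BETWEEN DATE '2000-01-01' AND CURRENT_DATE;
    -- A non-zero count fails the checkpoint and warrants a defect.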
Worked on Autosys for batch processing of ETL and PL/SQL subprograms, and performed backend testing.
Worked with OLAP models (Fact and Dimension tables; Star and Snowflake schemas).
Wrote Teradata SQL queries and created tables and views following Teradata best practices.
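For example, Teradata tables are typically defined with an explicit PRIMARY INDEX for even data distribution; the sketch below uses invented names:

    -- Illustrative Teradata DDL following the explicit PRIMARY INDEX practice.
    CREATE MULTISET TABLE test_db.customer_dim
    (
      customer_id   INTEGER NOT NULL,
      customer_name VARCHAR(100),
      load_dt       DATE
    )
    PRIMARY INDEX (customer_id);

    -- A view exposing only the columns testers need.
    CREATE VIEW test_db.v_customer_dim AS
    SELECT customer_id, customer_name
    FROM   test_db.customer_dim;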
Solved day to day problems of the team arising due to functionality and validation issues.
Performed defect tracking and root cause analysis for issues.
Manually executed and validated functional and migration tests for ETL and BI processes.
Testing of ETL jobs that are scheduled for file transfers from Operational Data Stores to designated file
systems/directories.
Created Filewatcher jobs to set up the dependency between Cloud and PowerCenter jobs.
Extensively involved in performance tuning of the Informatica ETL mappings by using caches, overriding the SQL queries, and using parameter files.
Created data-driven automation scripts for testing API Web Services using SOAP UI, REST API.
Extensively worked in the Unix Environment using Shell Scripts.
Tested the ETL process both before and after the data validation process.
Used the TOAD and SQL Navigator GUI tools for querying the database, and used TOAD and SQL*Plus for testing the execution of ETL processes against business rules.
Optimized and tuned several complex SQL queries for better performance and efficiency.
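One common tuning pattern from this kind of work, sketched with hypothetical tables, is replacing a correlated subquery with a set-based join:

    -- Before: the subquery runs once per row of orders.
    SELECT o.order_id,
           (SELECT c.region FROM customer c WHERE c.customer_id = o.customer_id) AS region
    FROM   orders o;

    -- After: a single join lets the optimizer choose an efficient plan.
    SELECT o.order_id, c.region
    FROM   orders o
    JOIN   customer c ON c.customer_id = o.customer_id;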
Performed backend testing by writing SQL queries and running PL/SQL scripts in TOAD.
Involved in business requirements gathering to enable data integration across all business groups.
Managed and executed the test process using the Agile methodology.
Tested various Reusable ETL Transformations which facilitate Daily, Weekly & Monthly Loading of Data.
Used various checkpoints in the Informatica Designer to check the data being transformed.
Tracked and reported defects into QC and notified management with details.
Involved in testing the batch jobs using UNIX and Autosys.
Environment: Informatica, Cognos, Windows, Toad, Oracle, SQL, PL/SQL, SOAP UI, REST API, Hadoop/Big Data, Hive,
UNIX, Agile, Teradata, HP Quality Center/ALM, XML.
Prepared and submitted summarized audit reports and took corrective actions.
Involved in uploading master and transactional data from flat files, preparation of test cases, and subsystem testing.
Participated in defining and executing test strategies using the Agile methodology.
Tested Informatica ETL mappings that transfer data from source systems to the Data Mart.
Utilized new technologies, tools, and frameworks centered around Hadoop/Big Data and other elements in the Big Data space.
Tested the functionality and performance of web services using SOAP UI, REST API.
Used predefined UNIX scripts for file transfer and file manipulation as part of Test Data preparation.
Involved in test scheduling and milestone planning with dependencies.
Applied Page breaks settings on Cognos reports.
Demonstrated a full understanding of the Fact/Dimension data warehouse design model, including star and snowflake design methods.
Worked with the development team, using defect reports to ensure testing issues were resolved.
Tracked, reviewed, and analyzed defects and compared results using Quality Center.
Defined the scope for System and Integration Testing.
Verified the back-end reflection in the database by establishing a connection to the database and executing SQL queries, both manually and through automation.
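A minimal sketch of such a back-end verification query (identifiers are hypothetical):

    -- After an update is made through the application front end,
    -- confirm the change is reflected in the underlying table.
    SELECT customer_id, status, last_updated
    FROM   customer
    WHERE  customer_id = 12345;
    -- Expected: status matches the value entered in the UI.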
Evaluated data profiling, cleansing, data integration, and extraction tools (e.g., Informatica).
Tested Analytical Data Mart (ADM) of Oracle system and Stored Procedures.
Identified and presented testing effort estimates to the project management team.
Tested slides for data flow and process flows using PowerPoint and Microsoft Visio.
Used TOAD Software for querying Oracle and used Teradata SQL Assistant for querying Teradata.
Tested reports for various portfolios using universe as the main data provider.
Involved in testing the Cognos reports by writing complex SQL queries.
Worked with the ETL tool (Informatica) to design mappings that move data from the source to the target database, in order to understand the functionality.
Checked naming standards, data integrity, and referential integrity.
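A typical referential integrity check of this kind, using invented fact and dimension table names, looks for orphaned foreign keys:

    -- Fact rows whose customer key has no matching dimension row.
    SELECT f.order_id, f.customer_key
    FROM   fact_orders f
    LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
    WHERE  d.customer_key IS NULL;
    -- Any rows returned violate referential integrity.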
Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined in the application against the metadata.
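For the field-size cross-check, Oracle's ALL_TAB_COLUMNS dictionary view can be queried and compared against the sizes the application expects (the table name below is hypothetical):

    -- Pull declared column sizes from the Oracle data dictionary.
    SELECT column_name, data_type, data_length
    FROM   all_tab_columns
    WHERE  table_name = 'FACT_ORDERS'
    AND    data_type IN ('VARCHAR2', 'CHAR');
    -- Compare data_length with the field sizes defined in the application metadata.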
Wrote SQL and PL/SQL scripts to validate the database systems and for backend database testing.
Comfortable working in an Agile environment.
Responsible for verifying web service API request and response data validations using REST APIs and SOAP UI.
Tested several UNIX shell scripts used for connecting to the database and for file manipulation.
Have exposure to Hadoop/Big Data testing using Hive.
Tested Desktop Intelligence and Web Intelligence reports.
Identified Test scenarios for Automation for critical functionality.
Tested Oracle PL/SQL programs.
Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
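A minimal sketch of such a staging-versus-warehouse comparison, with invented table names (MINUS is the Oracle set operator; some databases use EXCEPT):

    -- Rows present in staging but missing or different in the warehouse.
    SELECT customer_id, order_total FROM stg_orders
    MINUS
    SELECT customer_id, order_total FROM dw_orders;
    -- An empty result means staging and warehouse agree on these columns.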
Environment: Informatica, Quality Center, Cognos, Agile, HP/ALM, SOAP UI, REST API, Hadoop/Big Data, Hive, UNIX,
Oracle, XML, XSLT, SQL, PL/SQL, Stored Procedures, Teradata
Wrote SQL queries to validate that actual test results matched expected results.
Helped deliver ETL and data integration processes that populate the Teradata data warehouse, bringing in data from disparate marketing sources.
Created and executed SQL queries in TOAD to perform data integrity testing on the database and validate the data.
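One such data integrity query, runnable from TOAD against a hypothetical target table, checks for duplicate business keys:

    -- Business keys that appear more than once in the target.
    SELECT order_id, COUNT(*) AS dup_cnt
    FROM   dw_orders
    GROUP BY order_id
    HAVING COUNT(*) > 1;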
Wrote PL/SQL packages, procedures, and functions to implement the business logic.
Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
Validated mappings in Informatica to verify data from the source to the target database and recorded test results.
Checked the reports for any naming inconsistencies and to improve user readability.
Worked with UNIX for file manipulation, text processing, and data parsing, and converted SQL query results into UNIX variables.
Prepared Defect Reports and Test Summary Report documents.
Worked on issues with migration from development to testing for different data loading jobs.
Compared actual results with expected results and validated the data by a reverse engineering methodology, i.e., backward navigation from target to source.
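A sketch of this backward navigation, assuming hypothetical source and target tables: every target row should trace back to a source row.

    -- Target rows with no corresponding source record.
    SELECT t.customer_id
    FROM   dw_customer t
    LEFT JOIN src_customer s ON s.customer_id = t.customer_id
    WHERE  s.customer_id IS NULL;
    -- Rows returned here reached the target without a source counterpart.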
Used Business Objects Lifecycle Manager to migrate BI resources from one environment to another.
Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
Tested UNIX shell scripts as part of the ETL process and was involved in testing the shell scripts used to automate ETL jobs.
Wrote test scripts based on the business requirements and executed functional testing and data validation/reconciliation with defect correction and retesting, followed by regression and performance testing.
Wrote Teradata SQL queries and created tables and views following Teradata best practices.
Optimized and tuned several complex SQL queries for better performance and efficiency.
Performed Informatica ETL testing to validate end-to-end data flow from the MS SQL Server source system to the target environment.
Environment: Oracle, SQL, PL/SQL, Informatica, Teradata, ALM/HP Quality Center, UNIX, Business objects, TOAD, XML
Environment: Informatica, Microsoft SQL Server, XML, XSLT, XSD, T-SQL, MS Access