
SHIVANI BABASAHEB SHINDE

shivanishinde1301@gmail.com | +919309857297 | linkedin.com/in/shivanishinde130199

A motivated team player, ready to contribute effectively to data warehousing projects.
● Over 3 years of experience in Data Warehousing technology, specializing in the ETL tool IICS (IDMC).
● Developed ETL mappings, mapplets, and workflows using IICS CDI.
● Integrated data from diverse sources including Oracle, Snowflake, Flat Files, and CSV files.
● Implemented complex business rules through mappings and reusable transformations.
● Expertise in various Informatica transformations such as Lookup, Expression, Aggregator, Sorter, and Router.
● Implemented SCD Type 1 and Type 2 mechanisms, as well as dynamic and unconnected lookups.
● Utilized Informatica reusability through reusable transformations and mapplets.
● Proficient in incremental loading and parameterization.
● Conducted unit testing for mappings using sample data.
● Experience in working with Oracle databases and flat files.
● Strong analytical and problem-solving skills with a quick learning ability.

SKILLS

Languages: Python, C++, SQL
Test Management Tools: HP ALM, Jira
Software: Databricks, MS Excel, MS Word, Snowflake
Other: Data Visualization, Machine Learning, Data Analytics
Cloud: IICS CDI

EDUCATION

MIT World Peace University, Pune Jul ‘17 - Jun ‘21
B.Tech in Computer Science
Relevant coursework: Big Data Analysis, Introduction to Machine Learning and Algorithms, Data Structures and
Algorithms, Object Oriented Programming

WORK EXPERIENCE

Cognizant Aug ‘21 - Present
Programmer Analyst, Project related to the Life Sciences Domain

● Created ETL mappings using Informatica PowerCenter to move data from multiple sources, such as flat files and
Oracle, into common target areas such as Staging, Data Warehouse, and Data Marts.
● Designed the Mapping Technical Specifications based on Functional Requirements.
● Developed mappings using various transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner,
and Router to load data into staging tables and then into the target.
● Worked extensively with Mapping Variables and Mapping Parameters.
● Implemented the effective date range mapping (Slowly Changing Dimension Type 2) methodology to load Dimension
(History) tables.
● Used SQL tools like Oracle SQL Developer to run SQL queries and validate the data loaded into the target
tables.
● Reviewed documentation and code created by other team members.
● Verified data in the target database after the ETL process.
● Created Unit Test Documents.

Associate, Project for a US-based Retail Banking Company

Tools and Skills: IICS, SQL Developer, Snowflake
Database: Snowflake
Role: ETL Developer
Project Description: The project aims to migrate an on-premises Data Warehouse to a cloud-based solution. It involves
integrating multiple systems, including Oracle, SQL Server, and files, to push data into Snowflake for reporting and
analytics.

● Collaborated with IT architects and program managers to gather requirements, analyze needs, and coordinate
projects.
● Evaluated existing ETL Data Warehouse processes and devised design specifications for new targets.
● Established ETL and Data Warehouse standards, including naming conventions, methodologies, and data cleansing
strategies.
● Documented detailed mapping from source to target transformations and data column information.
● Designed, developed, and implemented ETL processes using IICS Data Integration.
● Utilized performance tuning techniques for efficient data loading into Snowflake via IICS.
● Applied various cloud transformations (e.g., Aggregator, Filter, Joiner) and connectors (Snowflake, Oracle, SQL
Server) extensively.
● Created cloud integration templates for parameterized mapping to facilitate Stage, Dimension, and Fact load
processes, including SCD Type1, SCD Type2, and Incremental Load.

PROJECTS

Smart Water Supply Using Internet of Things (Publication Link) Aug ‘19 - Oct ‘19
● Led a team of 4 and built a water supply system to limit the daily flow of water received by each household based on
per capita usage
● Conducted statistical analysis of data from ultrasonic and turbidity sensors to track the quantity of water supplied and
maintain its basic quality
● Stored the data required to analyze current methods of evaluating household water insecurity and identified the per
capita requirement of the people as a missing aspect of those methods

Surveillance System Using Intelligent Image Caption Generator

● Developed a surveillance system using an Intelligent Image Caption Generator and Deep Learning techniques to detect
malicious activity with minimal human intervention
● Utilized Python, OpenCV, and Convolutional Neural Network (CNN) to break down videos into frames and analyze
them to identify suspicious activities
● Implemented Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) to generate descriptive
captions for each frame
● Configured a Twilio app to send alerts to the user, along with the relevant image, if any abnormal behavior was
detected
● Deployed the application on AWS EC2 to ensure seamless performance and scalability
