
SKILLS EVALUATION SHEET

Candidate Name: Bellamkonda Manasa


Total Experience: 10.6 Years

Relevant Experience: 10.6 Years

Notice Period: LWD 8th March 2024

Skill possessed by the candidate (Mandatory/Optional; No. of months the skill was used; Projects in which the skill was used): Description of work done using the skill

• ETL SQL, PL/SQL (M; 126 months; All Projects): Good experience in Snowflake Data Warehouse, Airflow, Matillion ETL, AWS S3, Google Cloud, Informatica PowerCenter, Informatica Developer, Oracle 11i, PL/SQL, Shell Scripting, HTML.
• PL/SQL (M; 126 months; All Projects): Well experienced in PL/SQL.
• Relational databases like PostgreSQL (M; 126 months; All Projects): Migrated data from Oracle, PostgreSQL, SQL Server databases & SharePoint to Snowflake.
• Tableau, Power BI, or similar (O): Good knowledge of Tableau.
• Salesforce.com platform and related technologies (O): Good knowledge of the Salesforce tool.
• Snowflake (O; 48 months; MMI/Onsemi/EHC): Primary objective was to showcase a POC on migrating an ODI mapping to Matillion & Snowflake; worked on creating the tables in Snowflake using Oracle DDLs.
• Airflow (O; 48 months; MMI/Onsemi/EHC): Worked on setting up file handling from the remote path of legacy Airflow to Cloud Astronomer using the Airflow SFTP Sensor and SSH Operator.
• Spark (O).
Bellamkonda Manasa
Ph. No: +91 7406066088
Email: manasabellamkonda2212@gmail.com
CAREER OBJECTIVE

I intend to be part of an organization that offers professional growth and an opportunity to contribute my knowledge and ideas toward the growth of the organization and my own professional development.
PROFESSIONAL SUMMARY
• 10.6 years of strong experience in data warehousing, ETL & Data Quality.
• Good understanding of the Software Development Life Cycle.
• Involved in end-to-end implementation of projects.
• Good experience in Snowflake Data Warehouse, Airflow, Matillion ETL, AWS S3, Google Cloud, Informatica PowerCenter, Informatica Developer, Oracle 11i, PL/SQL, Shell Scripting, HTML.
• Extensively interacted with customers to fully understand project requirements and provide effective solutions.
• Received customer appreciation mails multiple times for providing good, on-time technical solutions.
• Involved intensely in the Design, Analysis, and Development phases of projects.
• Worked single-handedly on certain development activities.
• Continuously enhancing my technical and managerial skills for good organizational and professional growth.
TECHNICAL SKILLS
• TOOLS: Informatica PowerCenter 9.1/9.5/10.1, Informatica Developer, Snowflake Data Warehouse
• PROGRAMMING: PL/SQL, Unix Shell Scripting, Python (Beginner)
• DATABASE: Oracle, SQL Server
• Scheduling Tools: Informatica Scheduler, Airflow
• AWS: S3 (Cloud Object Storage)
CERTIFICATION
• Completed the Google Professional Data Engineer certification in Jan 2021.
Certification ID: Q5rZKs

EMPLOYMENT HISTORY

• Working as Lead Data Engineer with MassMutual India from Sep 2023 to date.
• Worked as Lead Data Engineer with KPI Partners from Sep 2022 to Sep 2023.
• Worked as Technical Lead with Infinite Computer Solutions from Nov 2021 to Aug 2022.
• Worked as Lead Developer with Accenture from Aug 2019 to Nov 2021.
• Worked as ETL Developer with EXL Analytics from Oct 2018 to Aug 2019.
• Worked as ETL Developer with TCS from Aug 2015 to Sep 2018.
• Worked as Systems Engineer with Polaris Financial Tech Ltd from Sep 2011 to May 2014.

WORK EXPERIENCES
MassMutual India
Duration: Sep 2023 to date
Tools: Vertica, Airflow, AWS S3
Role: Lead Data Engineer
Team Size: 7

Roles and Responsibilities:

• Managing a team of vendor, onshore, and offshore resources.
• Involved in sprint planning meetings to provide estimates.
• Involved in the successful transition from vendor onshore to offshore resources.
• Worked on setting up the team by onboarding new resources and providing the necessary KT.
• Worked as an individual contributor, performing data analysis and data engineering work for critical tasks.
• Worked with product owners to understand requirements and distribute tasks, ensuring the team meets deadlines.
• Working closely with the QA and reporting teams to coordinate on tasks.
• Working on performance optimization for some of the large tables in Vertica.

KPI Partners India Pvt. Ltd:


Duration: Mar 2023 to Sep 2023
Tools: Snowflake DWH, Matillion ETL, Python
Role: Lead Data Engineer
Team Size: 1
Client: NOV

Roles and Responsibilities:

• The primary objective was to showcase a POC on migrating an ODI mapping to Matillion & Snowflake.
• Analyzed the ODI scripts and implemented the logic in Matillion.
• Worked on creating the tables in Snowflake using Oracle DDLs.
• Worked on creating & loading Dim and Fact tables and the related Orchestration and Transformation jobs per the ODI logic.
• Performed end-to-end UAT testing to match counts and logic against ODI.
• Worked on tuning queries migrated from ODI that were running long in Snowflake.
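The table creation from Oracle DDLs mentioned above can be sketched as a small type-mapping pass; this is a minimal illustration, assuming a handful of common Oracle-to-Snowflake type conversions (real migrations use a proper DDL parser and cover many more types and constraints):

```python
import re

# Minimal Oracle -> Snowflake type mapping (illustrative subset only).
TYPE_MAP = [
    (re.compile(r"VARCHAR2\((\d+)(?:\s+(?:BYTE|CHAR))?\)", re.I), r"VARCHAR(\1)"),
    (re.compile(r"NUMBER\((\d+),\s*(\d+)\)", re.I), r"NUMBER(\1,\2)"),
    (re.compile(r"\bNUMBER\b(?!\()", re.I), "NUMBER(38,0)"),  # bare NUMBER
    (re.compile(r"\bDATE\b", re.I), "TIMESTAMP_NTZ"),
    (re.compile(r"\bCLOB\b", re.I), "VARCHAR"),
]

def oracle_ddl_to_snowflake(ddl: str) -> str:
    """Rewrite Oracle column types in a CREATE TABLE statement for Snowflake."""
    out = ddl
    for pattern, repl in TYPE_MAP:
        out = pattern.sub(repl, out)
    return out

oracle = "CREATE TABLE emp (id NUMBER, name VARCHAR2(100), hired DATE)"
print(oracle_ddl_to_snowflake(oracle))
# CREATE TABLE emp (id NUMBER(38,0), name VARCHAR(100), hired TIMESTAMP_NTZ)
```

The table and column names here are hypothetical; the point is only that most of the DDL carries over and the type clauses are what need rewriting.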

Duration: Sep 2022 to Feb 2023
Tools: Snowflake DWH, Matillion ETL, Python, Airflow (Scheduler)
Role: Lead Data Engineer
Team Size: 6
Client: Onsemi
Roles and Responsibilities:

• Mainly focused on data ingestion, extracting data from different sources.
• Migrated data from Oracle, PostgreSQL, SQL Server databases & SharePoint to Snowflake.
• Worked on setting up the items below from the remote path of legacy Airflow to Cloud Astronomer using the Airflow SFTP Sensor and SSH Operator:
o File sensor
o File movement
o File archival
• Worked on setting up file sensors, file movement & file archival for a list of files matching a file pattern, from the remote path of legacy Airflow to Cloud Astronomer, using SSH Hook & Paramiko SFTP.
• Worked on tuning SQL queries where performance degradation was observed.
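The sense / move / archive flow above can be sketched locally with the standard library. This is a simplified stand-in: the actual jobs ran against a remote SFTP path via Airflow's SFTP Sensor and SSH Hook/Paramiko, and the directory names and file pattern here are hypothetical:

```python
import fnmatch
import shutil
import tempfile
from pathlib import Path

def sense_files(landing: Path, pattern: str) -> list[Path]:
    """'File sensor': return files in the landing dir matching the pattern."""
    return sorted(p for p in landing.iterdir() if fnmatch.fnmatch(p.name, pattern))

def move_and_archive(files: list[Path], staging: Path, archive: Path) -> list[str]:
    """'File movement' then 'file archival': stage each file, keep an archive copy."""
    processed = []
    for f in files:
        shutil.copy2(f, archive / f.name)      # archival copy first
        shutil.move(str(f), staging / f.name)  # then move into staging
        processed.append(f.name)
    return processed

# Demo with temporary local directories standing in for the remote path.
base = Path(tempfile.mkdtemp())
landing, staging, archive = base / "landing", base / "staging", base / "archive"
for d in (landing, staging, archive):
    d.mkdir()
(landing / "sales_20240101.csv").write_text("a,b\n1,2\n")
(landing / "notes.txt").write_text("skip me\n")

found = sense_files(landing, "sales_*.csv")
print(move_and_archive(found, staging, archive))  # ['sales_20240101.csv']
```

In Airflow terms, each of the three helpers would map to a sensor or task in a DAG rather than plain function calls.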

Infinite Computer Solutions:


Duration: Nov 2021 to Aug 2022
Tools: Snowflake DWH, Matillion ETL, AWS S3, Python, Automate (Scheduler)
Role: Technical Lead
Team Size: 4

Client: Everside Health Care

Worked for a pharma client on onboarding members who opted for insurance into Salesforce and ECW.

Roles and Responsibilities:


• Creating ETL design documents by analyzing the source file layouts and data.
• Loading and transforming the data according to the ETL design.
• Worked on creating Type 2 and incremental loading designs for the merged data to load into the current system.
• Analyzed the merger data to bring it into the current system and prepared the data in sync with the current system.
• Worked on designing Matillion jobs to transform the data.
• Worked on peer reviews of ETL designs and code.
• Used Snowflake features such as Time Travel, Clone, Transformations, and Views.
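The Type 2 (history-keeping) loading design mentioned above can be sketched in plain Python. In the project this logic lived in Matillion jobs against Snowflake; the key and attribute names below (member_id, plan) are hypothetical:

```python
from datetime import date

def scd2_merge(dim: list[dict], incoming: dict, today: date) -> None:
    """Type 2 merge: expire the current row if attributes changed, insert a new version."""
    current = next(
        (r for r in dim if r["member_id"] == incoming["member_id"] and r["is_current"]),
        None,
    )
    if current and current["plan"] == incoming["plan"]:
        return  # no change, keep the existing current row
    if current:
        current["is_current"] = False  # expire the old version
        current["end_date"] = today
    dim.append({**incoming, "start_date": today, "end_date": None, "is_current": True})

dim: list[dict] = []
scd2_merge(dim, {"member_id": 1, "plan": "GOLD"}, date(2022, 1, 1))
scd2_merge(dim, {"member_id": 1, "plan": "SILVER"}, date(2022, 6, 1))
print(len(dim))  # 2 versions: an expired GOLD row and a current SILVER row
```

The same effect in Snowflake is usually achieved with a MERGE plus an insert of the new versions; the sketch only shows the row-versioning rule itself.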

Accenture Services:

CITI BANK Customer Hub:


Tools: Informatica Developer, Informatica Analyst, Oracle 12c
Role: Team Lead
Team Size: 3

Objective:
Single Client Master is an initiative to bring all customer-level information from different products such as Banking, PL, Home Loans, Credit Cards, NR & ABF into a single repository. Customer demographics stored in multiple systems such as Bank, Cards, PL, and Mortgages should be brought into a client repository module where applications and the business can access the necessary customer-level information for application processing and regulatory validations.
Roles and Responsibilities:
1. Understanding the requirements and participating in requirement-gathering meetings.
2. Worked on standardization of fields such as Country, State, and Salutation using reference tables as part of Data Quality.
3. Worked on data-quality checks on the format of fields such as PAN and Passport.
4. Worked on exception handling for records failing the quality checks.
5. Worked on mapping documentation for the model upgrades.
6. Responsible for loading data into the MDM BO tables.
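The format checks in items 3–4 can be sketched with regular expressions. The PAN pattern follows the common five-letters/four-digits/one-letter layout; the passport pattern and the exception-routing shape are simplified, hypothetical versions of the real Informatica Data Quality rules:

```python
import re

# Simplified data-quality rules (illustrative; production rules were built in
# Informatica Data Quality with reference tables and richer checks).
RULES = {
    "pan": re.compile(r"^[A-Z]{5}[0-9]{4}[A-Z]$"),  # e.g. ABCDE1234F
    "passport": re.compile(r"^[A-Z][0-9]{7}$"),     # simplified pattern
}

def check_record(record: dict) -> tuple[bool, list[str]]:
    """Return (passed, failed_fields); failures would feed an exception table."""
    failed = [f for f, rx in RULES.items() if f in record and not rx.fullmatch(record[f])]
    return (not failed, failed)

good = {"pan": "ABCDE1234F", "passport": "K1234567"}
bad = {"pan": "1234ABCDE", "passport": "K1234567"}
print(check_record(good))  # (True, [])
print(check_record(bad))   # (False, ['pan'])
```

Records returned with a non-empty failure list correspond to the rows routed to exception handling in item 4.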
INTIENT Research:
Tools: Informatica PowerCenter 10, Oracle 12c, Google Cloud Data Fusion, BigQuery
Role: Developer
Team Size: 4
Objective:
INTIENT Research is an Accenture product for Clinical Research and Development in Life Sciences, maintaining details of studies, areas of research, and patients. The project was implemented on Informatica PowerCenter, and I participated in its migration to the cloud.
Roles and Responsibilities:
1. Understanding the warehouse implementation on PowerCenter.
2. Worked on POCs with Google Cloud products.
3. Used Cloud Data Fusion to implement the scenarios used in the current project.
4. Used BigQuery for table creation and ETL via SQL, based on the SDTM document provided by Onshore.
5. Loaded files to test the speed of the INTIENT tool.

Novo Nordisk Veeva:

Tools: Informatica PowerCenter 10, Oracle 12c
Role: Developer
Team Size: 4
Client: NNI

Objective:
Veeva CRM is a cloud-based Customer Relationship Management application built on the Salesforce platform. It is used by NNI's field-based employees to manage their customer data, record interactions, manage sample inventory, access interactive visual aids, etc.

Roles & Responsibilities:


• Analyzing the requirements and working on impact analysis for the changes provided.
• Participating in requirement-gathering meetings with Onshore.
• Implementing data flows from Informatica to the CRM and back for updates in the database.
• Maintaining an inventory of workflow details and STM documents.
• Worked on performance tuning in Informatica for workflows running for long hours; the code was modified to implement best-practice standards.

EXL Services:

Duration: Oct 2018 to Aug 2019


1) CUNA Insurance Warehouse
Tools: Azure Data Factory, Snowflake DB
Role: Developer
Team Size: 10
Client: CUNA

Roles & Responsibilities:


• Analyzing requirements and creating mapping documents from them.
• Created Azure data pipelines using the Copy Data activity for transforming and loading data.
• Worked with various source formats such as text files, XML, and delimited files.
• Created mappings to handle SCD1 and SCD2 using Snowflake in Azure pipelines.
• Worked in coordination with Onshore on requirement gathering and supported the testing team on defects and doubts raised.
• Gave training to freshers on ETL and DWH activities.
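The SCD1 half of the mapping work above (overwrite in place, no history, in contrast to the versioned Type 2 loads) can be sketched in plain Python; in the project this was Snowflake logic driven from Azure pipelines, and the key/column names here are hypothetical:

```python
def scd1_upsert(dim: dict[int, dict], incoming: dict) -> str:
    """Type 1 upsert: overwrite changed attributes in place, keep no history."""
    key = incoming["policy_id"]
    if key in dim:
        if dim[key] == incoming:
            return "unchanged"
        dim[key].update(incoming)  # overwrite: the old values are lost
        return "updated"
    dim[key] = dict(incoming)
    return "inserted"

dim: dict[int, dict] = {}
print(scd1_upsert(dim, {"policy_id": 7, "holder": "A. Smith"}))        # inserted
print(scd1_upsert(dim, {"policy_id": 7, "holder": "A. Smith-Jones"}))  # updated
print(dim[7]["holder"])  # A. Smith-Jones
```

In SQL terms this is a plain MERGE with UPDATE/INSERT branches, whereas the SCD2 design additionally expires the old row and inserts a new version.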

Education

B.Tech/B.E., Computers, 2010

I hereby declare that the information furnished here is true to the best of my knowledge.
Downloaded for recruitment at SP Software (Feb 21, 2024)
