
Siva

Email: bckrbckr61@gmail.com
Ph: +91 9948996540

TALEND BIG DATA PROFESSIONAL


A software professional with 4+ years of proven IT experience in data integration using Talend Studio (6.1/5.x) and Talend
Admin Center (TAC). Excellent exposure to Talend and well acquainted with the software development life cycle.

SUMMARY
• Excellent experience with the ETL tool Talend (Data Integration, Data Quality and TAC perspectives).
• Proficient in TAC (Talend Administration Center).
• Involved in resolving various performance issues in Talend jobs.
• Good software design abilities, including design principles and patterns as well as module-level design.
• Well experienced in the Agile (Scrum) software development life cycle.
• Clear understanding of software development life cycles.
• Very good hands-on experience with various user training activities, both onsite and offshore.
• Experienced in developing technical documentation and user training manuals, and publishing knowledge articles.

AREA OF EXPERTISE
• Strong working knowledge of the software development life cycle.
• Good knowledge of requirement analysis.
• Extensive experience in coordinating project teams on deliverables, status reporting to management and issue escalations.
• Understanding of business requirements and functional specifications.

CAREER OBJECTIVE

To work in a challenging position in an organization by utilizing abilities developed through hard work, determination
and perseverance, with an approach reflecting enthusiasm and a positive attitude towards achieving my goals.

TECHNICAL SKILLS

ETL Tools: Talend 6.x/7.x (DI, DQ, MDM, BD)
Administration Tool: TAC (Talend Administration Center)
Databases: Oracle
Database Programming: PL/SQL
Big Data: Hive, basic Hadoop
Tools / IDE: Talend Studio (7.2.1/5.x), TOAD
Splunk: Data ingestion in AWS, dashboards

ACADEMIC DETAILS
• B.Tech in Electrical and Electronics Engineering from JNTUA, Andhra Pradesh.

EMPLOYMENT HISTORY
• Capgemini Technology Services India Limited, Bangalore – Software Engineer – Sep 2018 to Sep 2021.
• Tata Consultancy Services, Hyderabad – System Engineer – Sep 2021 to date.

PROJECT#1 (Current):

Project Name: Project Execution


Client: GE POWER
Team Size: 12
Role: ETL Developer (Talend, PostgreSQL)

• Worked as an ETL developer, extracting, transforming and loading (ETL) data from multiple data sources and file sources
into data warehouses and data marts using Talend, working extensively with slowly changing dimensions (see the SCD
sketch below). Handled a couple of engagements of various capacities as a Talend ETL designer and consultant.
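
As an illustration of the Type 2 slowly changing dimension handling mentioned above, here is a minimal PostgreSQL sketch;
dim_customer, stg_customer and their columns are hypothetical names for illustration, not taken from the actual project:

    -- Close out the current version of any customer whose tracked attributes changed.
    UPDATE dim_customer d
    SET    valid_to   = CURRENT_DATE,
           is_current = FALSE
    FROM   stg_customer s
    WHERE  d.customer_id = s.customer_id
      AND  d.is_current
      AND  (d.city <> s.city OR d.segment <> s.segment);

    -- Insert a fresh current version for changed or brand-new customers.
    INSERT INTO dim_customer (customer_id, city, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.city, s.segment, CURRENT_DATE, DATE '9999-12-31', TRUE
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current
    WHERE  d.customer_id IS NULL;

In Talend this pattern is usually generated by the dedicated SCD components or modelled in a tMap; the SQL above only
shows the effective logic.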

• Recently worked on a migration project in which a Greenplum data lake was migrated to the AWS Aurora platform;
in this project, worked on SQL tuning and query optimization (a sketch of the workflow follows below).
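
A minimal sketch of the tuning workflow involved, assuming a hypothetical orders table; the names and the index are
illustrative only:

    -- Inspect the optimizer plan for a slow query (PostgreSQL / Aurora PostgreSQL).
    EXPLAIN ANALYZE
    SELECT order_id, total_amount
    FROM   orders
    WHERE  customer_id = 4711
      AND  order_date >= DATE '2021-01-01';

    -- If the plan shows a sequential scan on a large table, an index on the
    -- filter columns is a common first fix.
    CREATE INDEX idx_orders_customer_date ON orders (customer_id, order_date);

Queries that relied on Greenplum's distribution keys typically need this kind of re-indexing after moving to Aurora.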

• Currently working on a Talend development project, designing ETL Talend jobs to migrate data from data sources to the
client application's AWS database, and creating PostgreSQL queries to provide the relevant data for Tableau dashboards
(an example query follows below).
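
A minimal example of the kind of PostgreSQL query exposed to Tableau, with hypothetical table and column names:

    -- Monthly order volume and revenue per region, published to Tableau as a view.
    CREATE OR REPLACE VIEW v_monthly_revenue AS
    SELECT region,
           date_trunc('month', order_date) AS order_month,
           COUNT(*)                        AS order_count,
           SUM(total_amount)               AS revenue
    FROM   orders
    GROUP  BY region, date_trunc('month', order_date);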

• Worked on two projects at TCS so far: 1. the Aurora migration project and 2. EHS & Quality (Talend & Data Engineer).

PROJECT#2:

Project Name: Transforming Enterprise Reporting and Analytics (TERA)


Client: HPI
Team Size: 14
Role: ETL Developer (Talend, Hive, SQL, Hadoop, Azure)

HP-TERA is a cloud-based project in which HP data is moved to a cloud platform using data processing components such as
Hadoop, Hive, Spark, Talend and Azure. The project has three layers: Ingestion, Distillation and Consumption. Ingestion
receives data from different sources, which is processed using Hive and then moved to Distillation. Using Talend,
Distillation loads the data into different tables and drops the files to ADLS (Gen2). Consumption then uses ADF to trigger
pipelines from ADLS that load the data and build dashboards and models as per the business requirements. A sketch of a
typical Distillation step follows below.
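
As a rough sketch of the Distillation step described above, assuming hypothetical Hive tables raw_events and dist_events
with a load_date partition column:

    -- Allow dynamic partitioning for the date-partitioned target (HiveQL).
    SET hive.exec.dynamic.partition.mode=nonstrict;

    -- Move cleansed ingestion-layer data into a partitioned distillation table.
    INSERT OVERWRITE TABLE dist_events PARTITION (load_date)
    SELECT event_id,
           TRIM(event_type)            AS event_type,
           CAST(event_ts AS TIMESTAMP) AS event_ts,
           load_date
    FROM   raw_events
    WHERE  event_id IS NOT NULL;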

Responsibilities:
• Involved in gathering requirements from the business.
• The requirement was to archive all deep-history data from the EDW to the Azure Data Lake Storage Gen2 archive zone.
• Encrypted the data while loading it from the reporting zone to the target path.
• Masked PII data (see the masking sketch after this list).
• Completed the end-to-end implementation with proper unit testing.
• Involved in creating a Talend template job to extract data from various sources such as databases and files.
• Created various layers of jobs (Ingestion, Distillation and Extraction) to pull data from the source and place it in ADLS.
• Worked with Type II slowly changing dimensions.
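
A minimal sketch of the kind of PII masking applied before data reaches the archive zone; the customers table, its columns
and the salted-hash approach are assumptions for illustration, not the project's actual logic:

    -- Hash direct identifiers and truncate quasi-identifiers (HiveQL).
    SELECT customer_id,
           sha2(concat(email, 'static-salt'), 256) AS email_masked,
           concat('XXX-XXX-', substr(phone, -4))   AS phone_masked,
           purchase_amount
    FROM   customers;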

PROJECT#3:

Project Name: Data Ingestion Framework (WALT DISNEY)


Client: The Walt Disney Company, Lake Buena Vista, Florida, US
Team Size: 25
Role: ETL Developer (TALEND)

The mission of the Walt Disney Company is to entertain, inform and inspire people around the globe through the
power of unparalleled storytelling, reflecting the iconic brands, creative minds and innovative technologies that make ours the
world's premier entertainment company. Their goal is to benefit their guests, employees and businesses, while making
the Company a desirable place to work through their corporate social responsibility efforts.

Responsibilities:
• Requirements gathering & analysis, estimating the effort and delivery timelines.
• Preparing the database design and implementing it.
• Implementing the ETL design and development.
• Performing data integration with the goal of moving more data more effectively, efficiently and with high performance.
• Responsible for fetching data from various source systems.
• Integrating other team members' jobs to create the master job.
• Analyzed sources and targets, and transformed, mapped and loaded the data into targets using DI jobs.
• Tuned bad queries by analyzing explain plans and visual explain output to understand optimizer plans.
• Responsible for developing various mappings and transformations, and for the migration of data using Talend jobs.
• Scheduled jobs involving shell scripts in a Unix environment.
• Responsible for logging issues and onsite/offshore coordination.
• Used FTP to move data from one server to another.
• Responsible for code reviews of jobs, unit testing, and loading and validating the data (see the validation sketch after
this list).
• Delivered the weekly and daily status reports.
• Used Talend reusable components such as routines and context variables.
• Analyzed the source data to assess its quality.
• Provided day-to-day updates and reports to the Program/Product Managers.
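
A minimal sketch of the kind of load-validation query used during unit testing, assuming hypothetical source_orders and
target_orders tables:

    -- Compare row counts and a simple checksum between source and target after a load.
    SELECT 'source'          AS side,
           COUNT(*)          AS row_count,
           SUM(total_amount) AS amount_checksum
    FROM   source_orders
    UNION ALL
    SELECT 'target', COUNT(*), SUM(total_amount)
    FROM   target_orders;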

