Summary
I have 9+ years of total experience in IT, including 6 years with Hadoop and Big Data technologies and tools such as Python, Spark, Oozie and GCP (basics).
Worked on Data Lakes and Data Pipelines using CDAP and Spark.
Knowledge of the Big Data Hadoop ecosystem.
Extensively worked on Spark and Hive technologies across different project life-cycle phases - Design, Development, Defect Analysis, Data Analysis and Data Mapping. Looking for a challenging role in the Big Data and Cloud Migration space.
Experienced in Agile methodology.
Education
S. No.  Course                                      School / College                                                              Year of Passing  Percentage of Marks
1       M.C.A. (Master of Computer Application)     Paventhar Bharathidasan College of Engineering & Technology, Anna University  2012             7.5 (GPA)
2       B.Sc. (Bachelor of Science)                 E.G.S. Pillay Arts & Science College, Bharathidasan University, Nagapattinam  2009             6.3 (GPA)
3       HSC                                         St. Anthony’s Higher Secondary School                                         2006             55
4       SSLC                                        St. Anthony’s Higher Secondary School                                         2004             69
Technical Skills
Hadoop (HDFS)
Sqoop
Spark
Hive
Oozie
Linux
GCP
Experience:
The project involves migrating the on-premises Big Data ecosystem to CDP Private Cloud, which included setting up a new framework, creating the data lake, and migrating workloads to the new framework.
Responsibilities:
Sensitivity: Confidential - Not for you? Notify the sender and delete. See more on https://www.proximus.com/respect-confidentiality
Raju Pandi.
Mobile Number: (+91) 8144568078, (+32) 467767363
Email: raju.pandi@tcs.com
Environment: Scala, Python, HDFS, Hive, Sqoop, Shell Script, Spark SQL, Oozie.
Developing end-to-end pipelines to ingest data from various sources into Google Cloud through the Hadoop ecosystem, using Apache Spark with Python and other GCP tools (BigQuery, Data Fusion, Dataproc).
· Developing data pipelines using Data Fusion and CDAP.
· Developing data pipelines for third parties to consume data from the big data platform.
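The ingestion work above lands source extracts in a format BigQuery load jobs accept (newline-delimited JSON). A minimal sketch of one such transform step, in pure Python with no GCP client libraries; the function name and sample data are illustrative:

```python
import csv
import io
import json

def csv_to_jsonl(csv_text):
    """Convert a CSV source extract into newline-delimited JSON,
    the format BigQuery load jobs accept for file-based loads."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return "\n".join(json.dumps(row, sort_keys=True) for row in reader)

sample = "id,name\n1,alice\n2,bob"
print(csv_to_jsonl(sample))
```

In a real pipeline this step would run inside a Spark job or a Data Fusion wrangler stage rather than on a single string.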
Responsibilities:
Took leadership of the Bridge File Collector development using CDAP pipelines by creating a template
pipeline that can be reused across all file-based applications, cutting the development time required
per pipeline from 1 day to 1 hour.
Received appreciation from the VF customer for delivering pipelines for SIT/E2E on time.
Responsible for supporting SIT/E2E testing with the VF team by running the respective pipelines for file
movement, and addressed errors/issues on time.
Analysed data in BigQuery, applied transformations, extracted the data into files and stored them in a
GCP location.
Responsible for providing configuration information to support the Collibra DMaaP integration.
Worked on enabling File-to-File and DB-to-File mapping scripts for the DMaaP File Collector Ph1 project.
Involved in delivering file collectors within the timelines agreed with the VF customer.
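The template-pipeline idea above can be sketched as a small parameterised config generator: one base definition, rendered per file-based application. Field names here are illustrative, not actual CDAP pipeline JSON:

```python
import copy

# Base pipeline definition shared by all file-based applications.
# The structure and field names are illustrative, not real CDAP config.
TEMPLATE = {
    "source": {"type": "file", "path": None},
    "sink": {"type": "hdfs", "path": None},
    "schedule": "daily",
}

def render_pipeline(app_name, source_path, sink_path):
    """Produce one concrete pipeline config from the shared template."""
    pipeline = copy.deepcopy(TEMPLATE)  # never mutate the template itself
    pipeline["name"] = f"{app_name}-file-collector"
    pipeline["source"]["path"] = source_path
    pipeline["sink"]["path"] = sink_path
    return pipeline

p = render_pipeline("billing", "/landing/billing", "/raw/billing")
print(p["name"])  # billing-file-collector
```

The deep copy is the important design choice: each rendered pipeline is independent, so a per-application tweak cannot leak into the template used by the next application.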
UNE EPM Telecommunications, doing business as Tigo UNE, is a Colombian telecommunications company created
in 2006, owned equally (50% each) by Grupo EPM and Millicom International Cellular, S.A. The company offered
telecommunications services nationwide under the Tigo UNE brand, internationally under the Tigo brand, and under the
Orbitel brand in Canada, the United States and Spain (with long-distance services and mobile telephony via Orbitel
Mobile).
Responsibilities:
Processed the historic data of Tigo UNE from Oracle 11g into the raw zone as Hive tables using Sqoop.
Provided solutions using the Hadoop ecosystem - HDFS, PySpark, MapReduce, Sqoop, Hive, Zaloni
Bedrock, Cloudera.
Extracted data from Oracle 11g and files from Tigo UNE customers through Sqoop, placed them in HDFS,
processed them further and applied the business logic in the Application Zone.
Performed further reconciliation audits through Hive queries (for direct mappings), with the results
maintained in HDFS.
Designed the Data Lake in various stages based on the type of data.
Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as
MapReduce jobs. Imported and exported data between HDFS and Hive using Sqoop.
Familiarized with job scheduling using Bedrock so that CPU time is well distributed among all the jobs.
Involved in clarifying business requirements in coordination with business users.
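The Sqoop-based ingestion described above can be sketched as a command builder; the JDBC URL, table name and HDFS paths below are illustrative, and only standard Sqoop import flags are used:

```python
def sqoop_import_cmd(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble the argument list for a Sqoop import from an Oracle
    source into an HDFS raw-zone directory, registering it in Hive."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),  # parallel map tasks
        "--hive-import",  # also create/load the matching Hive table
    ]

cmd = sqoop_import_cmd("jdbc:oracle:thin:@dbhost:1521:orcl",
                       "CUSTOMERS", "/raw/customers")
print(" ".join(cmd))
```

In practice this list would be passed to a scheduler (Bedrock or Oozie) or to `subprocess.run` on an edge node rather than printed.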
A US-based corporation providing financial, property and consumer information, analytics and business
intelligence. The company analyses information assets and data to provide clients with analytics and
customized data services.
EDG and ADC:
There are three stages in the module: pre-processing, reconciliation and post-processing. Pre-processing automatically
fetches import files, submitted by the end user, from FTP. Data is extracted using Kofax OCR keying, then
reconciled, and validation steps are performed; once business validation passes the business quality check, the
file is sent automatically to the output path. Post-processing fetches the output, converts it to the specific format
along with the automated file additions, and sends it back to CoreLogic as a 3072-byte output file.
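The 3072-byte fixed-length output record described above can be sketched as a simple record packer; the field values and widths are illustrative, not the actual CoreLogic layout:

```python
def to_fixed_record(fields, widths, record_len=3072):
    """Pack field values into a fixed-width layout, truncating or
    space-padding each field, then pad the whole record to 3072 bytes."""
    body = "".join(value.ljust(width)[:width]
                   for value, width in zip(fields, widths))
    return body.ljust(record_len)

rec = to_fixed_record(["12345", "SMITH"], [10, 25])
print(len(rec))  # 3072
```

Fixed-length records like this let the downstream consumer locate every field by byte offset, with no delimiters to escape.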
Responsibilities:
Solar-Panel Campaign is a web-based application developed for a BPO process. The main aim of the
campaign is to increase sales of solar panels in the US. There is a good chance that US citizens
will not understand our native English accent. To solve this problem, we implemented an avatar-based voice process
where the necessary conversations are stored as prompts. The customer executive can listen to the voice of
the client and communicate through the voice stored in the prompt. The prompts can be expanded as
per the client's needs. Here our system is integrated with the dialler responsible for handling inbound and
outbound calls.
Personal Details
Name : Raju Pandi.
Date of Birth : 25th July 1987.
Sex : Male.
Marital Status : Married.
Nationality : Indian.
Languages : Tamil & English.
Hobbies : Playing Cricket.
Present Address : No 11, Marunthu Kothala Road,
Nagapattinam, Tamil Nadu,
Nagapattinam - 611001.
I hereby declare that the above-mentioned information is true to the best of my knowledge.
Yours truly,
Raju Pandi.