
SUNNY DHURE
9834320275 | sunny.dhure0761@gmail.com
Big Data Developer

SUMMARY
To work in a challenging environment where I can apply my knowledge and skills to help achieve goals and increase profitability, while continuing to learn so that my skill set serves both organizational goals and my personal objectives.

EDUCATION

Completed B.Sc. from Sant Gadge Baba Amravati University.

SKILLS

• Apache Hadoop
• Apache Spark
• Hive
• HBase
• MapReduce
• HDFS
• Spark
• PySpark
• Spark Streaming
• Spark Core
• Python
• Kafka
• SQL
• Unix
• AWS

PROFESSIONAL EXPERIENCE

PubMatic India Pvt. Ltd., Pune
Big Data Developer
January 2021 to present

➢ 2.5 years of experience in a Big Data environment as a Hadoop engineer.
➢ Good hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, Sqoop, HBase, Python, Scala and Spark).
➢ Involved in building a multi-node Hadoop cluster; installed and configured Cloudera Manager and managed and analyzed Hadoop log files.
➢ Configured the Hive Metastore to use a MySQL database to support multiple user connections to Hive tables. Imported data into HDFS using Sqoop.
➢ Experience in retrieving data from databases such as MySQL into HDFS using Sqoop and ingesting it into HBase.
➢ Good experience in the Python and Scala programming languages.
➢ Experience in transferring data from RDBMS to HDFS and Hive using Sqoop.
➢ Experience in creating tables, partitioning, bucketing, and loading and aggregating data using Hive.
➢ Experience in using Spark SQL with various data sources such as CSV, JSON and Hive.
➢ Experience with HDFS architecture and cluster concepts.
➢ Experience working with Agile methodology and JIRA.
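The Hive table work noted above (creating, partitioning, bucketing and loading tables) could be sketched roughly as follows; the database, table and column names are illustrative assumptions, not details of the actual projects:

```sql
-- Illustrative HiveQL only: all names here are invented.
-- A managed table partitioned by date and bucketed by customer id.
CREATE TABLE IF NOT EXISTS sales_db.transactions (
  txn_id   BIGINT,
  cust_id  BIGINT,
  amount   DECIMAL(10,2)
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (cust_id) INTO 8 BUCKETS
STORED AS ORC;

-- Load from a staging table with dynamic partitioning enabled.
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE sales_db.transactions PARTITION (txn_date)
SELECT txn_id, cust_id, amount, txn_date
FROM sales_db.transactions_staging;

-- Aggregate per partition.
SELECT txn_date, SUM(amount) AS daily_total
FROM sales_db.transactions
GROUP BY txn_date;
```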
PROJECT :

Project 1 :-
Domain worked on :- Banking domain
Technologies used :- Hadoop, Hive, SQL, HBase, Sqoop and Cloudera
Roles and Responsibilities :

❖ Writing script files for processing data and loading it to HDFS.
❖ Loading files to HDFS and writing Hive queries to process the required data.
❖ Set up Hive with MySQL as a remote metastore.
❖ Used Sqoop to pull data and created Hive tables on data in the raw layer.
❖ Involved in gathering requirements, design, development and testing.
❖ Involved in extending pipelines to HBase for the dashboard.
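The raw-layer step above (Hive tables over Sqoop-imported data) might look roughly like this in HiveQL; the path and schema are hypothetical, not the real project's:

```sql
-- Hypothetical example: path and columns are assumptions.
CREATE EXTERNAL TABLE IF NOT EXISTS raw_db.accounts_raw (
  account_id  STRING,
  balance     DECIMAL(12,2),
  updated_at  STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/raw/accounts';  -- HDFS directory populated by Sqoop
```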

Project 2 :-
Domain worked on :- Retail domain
Technologies used :- Spark, Hive, SQL, HBase, PySpark

Roles and Responsibilities :

❖ Sources for this project were a MySQL database and file sources, combined using ETL operations.
❖ Completely involved in the requirement analysis phase.
❖ Involved in PySpark programming for preprocessing and cleaning data for the next stage of analytics.
❖ These tables were then moved from raw to processed to curated form via Python/Spark/SQL scripts, and later loaded into Hive.
❖ Performed bulk loads from Hive to HBase.
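One common way to move Hive data into HBase is Hive's HBase storage handler; this is a sketch of that approach, not the project's actual code, and the table, column-family and column names are invented:

```sql
-- Illustrative only: names are assumptions.
CREATE TABLE curated_db.orders_hbase (
  order_key  STRING,
  status     STRING,
  total      STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:status,d:total')
TBLPROPERTIES ('hbase.table.name' = 'orders');

-- Push curated Hive data into the HBase-backed table.
INSERT OVERWRITE TABLE curated_db.orders_hbase
SELECT order_key, status, CAST(total AS STRING)
FROM curated_db.orders_curated;
```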

PERSONAL DETAILS

Date of birth :- 29/10/1999


Nationality :- Indian
Gender :- Male
Marital status :- Single
Languages known :- English, Hindi, Marathi

DECLARATION
I hereby declare that the facts given above are genuine to the best of my
knowledge and belief.
