
Bhavnesh Baghel Data Engineer

+91 7048974879 Current location: Agra, U.P.

bhavnesh1322@gmail.com 03/13/2000

bhavnesh-baghel Bhavnesh Baghel

Summary
More than one year of experience working in the data field, currently as a data engineer. I have Python skills for building ETL pipelines and strong analytical skills for manipulating data.

Work Experience
Data Engineer at KycHub 12/2022 – 08/2023
Bengaluru, India
- Built an end-to-end ETL pipeline.
- Used Airflow to simplify and manage workflow processes.
- Used Selenium and BeautifulSoup4 to automate data extraction from websites.
- Developed Python-based APIs to enhance automation and facilitate robust testing.
- Gained an understanding of Spark and Kafka, contributing to data processing and real-time streaming capabilities.

Data Science Intern at iNeuron.ai 11/2021 – 06/2022
Bengaluru, India
- Prepared raw data for manipulation by data scientists.
- Worked closely with a team of frontend and backend engineers, product managers, and analysts.
- Performed the data analysis required to troubleshoot and resolve data-related issues.
- Applied machine learning algorithms to assess the accuracy of the data.

Education
B.Tech in Computer Science and Engineering 2018 – 2022
Faculty of Engineering and Technology, Agra College Agra, India

Intermediate and High School in PCM 2016 – 2018
M.D. Jain Inter College, Agra Agra, India

Skills
Python (Selenium, bs4, Flask, Django, Data Modeling, Debugging)
Experience in writing SQL queries for database engines.
Machine Learning (Supervised)
Web Technologies (HTML5, CSS3, JavaScript)
Knowledge of APIs, GitHub, CI/CD, Docker, and YAML.
Data Warehousing (Hive)
Languages
Hindi, English

Interests
Football, Photography, Chess, Problem-solving

Certificates
Full Stack Data Science course from iNeuron.ai
Master SQL for Data Science
Responsive Web Design certification

Projects
2. Fincai
Project Overview:
I worked on a project focused on designing and building an ETL pipeline. The pipeline includes APIs for the extraction, transformation, and loading stages. We carefully structured the directory layout to support automation with Airflow, and we incorporated cleaning and crawling scripts, among other components, to make the project comprehensive.
Technological Landscape:
For this project we used Python as our programming language of choice, along with the Flask framework. To enhance performance we took advantage of AWS cloud services, and for our database we opted for PostgreSQL, which provides reliable data management and easy accessibility.
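As a minimal illustration of the extract-transform-load structure described above (not the actual Fincai code, which uses Flask, Airflow, and PostgreSQL), the three stages can be sketched in plain Python with hypothetical record fields:

```python
# Sketch of an ETL pipeline's three stages, using in-memory sample data
# in place of the real extraction API, Airflow orchestration, and PostgreSQL.

def extract():
    # In the real pipeline this would call an extraction API;
    # here we return sample records with hypothetical fields.
    return [{"name": " Alice ", "amount": "100"},
            {"name": "Bob", "amount": "250"}]

def transform(records):
    # Clean string fields and cast numeric fields.
    return [{"name": r["name"].strip(), "amount": int(r["amount"])}
            for r in records]

def load(records, store):
    # Stand-in for inserting rows into the database.
    store.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)        # 2
print(warehouse[0])  # {'name': 'Alice', 'amount': 100}
```

In the real project, each stage would be an Airflow task and `load` would write to PostgreSQL; chaining plain functions keeps the sketch self-contained.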

1. SOS - Accident Detection
Project Overview:
The system detects accidents as they happen and springs into action by automatically sending an alert with the exact location on Google Maps, together with a distress message, to nearby hospitals. The hospitals can then respond swiftly by dispatching ambulances, guided straight to the scene by the Google Maps link.
Technological Approach:
The project is built in Python using the Django framework, with SQLite3 as the database keeping everything organized. APIs let the system communicate with other services. At its core is YOLOv5, a deep-learning model that we trained on our own dataset so the system recognizes accidents with high accuracy.
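The alerting step described above can be sketched as follows; the message format and coordinates are illustrative assumptions, not the project's actual implementation, though the Maps link uses Google's documented URL scheme:

```python
# Sketch: compose the Google Maps link and distress message sent to
# nearby hospitals when an accident is detected (illustrative only).
from urllib.parse import urlencode

def maps_link(lat, lng):
    # Google Maps URLs API: a search URL pointing at a coordinate pair.
    return "https://www.google.com/maps/search/?" + urlencode(
        {"api": "1", "query": f"{lat},{lng}"})

def distress_message(lat, lng):
    # Hypothetical message body; the real project's wording may differ.
    return f"Accident detected. Location: {maps_link(lat, lng)}"

print(distress_message(27.1767, 78.0081))  # sample coordinates (Agra)
```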

3. Review Scrapper
Project Objective:
The goal of this project is simple but powerful: to retrieve and isolate product reviews, along with their ratings, from the Flipkart website.
Technological Approach:
The project is written in Python, with the Flask framework providing its structure. Web scraping is handled with the Beautiful Soup library, which lets us traverse the pages and extract the information we need.
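The scraping step can be sketched as below. The real project uses the Beautiful Soup library against Flipkart's live pages; this self-contained sketch instead uses Python's standard-library HTMLParser on a made-up snippet, and the class names ("review", "rating") are invented for illustration:

```python
# Sketch: extract review text and ratings from HTML. The markup and
# class names are hypothetical; Flipkart's real pages differ, and the
# project itself used Beautiful Soup rather than html.parser.
from html.parser import HTMLParser

class ReviewParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.current = None          # class of the tag we are inside
        self.reviews, self.ratings = [], []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("review", "rating"):
            self.current = cls

    def handle_data(self, data):
        # Collect text from the tag we just entered, then reset.
        if self.current == "review":
            self.reviews.append(data.strip())
        elif self.current == "rating":
            self.ratings.append(data.strip())
        self.current = None

html = ('<div class="rating">4</div><div class="review">Great phone</div>'
        '<div class="rating">2</div><div class="review">Battery drains fast</div>')
parser = ReviewParser()
parser.feed(html)
print(list(zip(parser.ratings, parser.reviews)))
# [('4', 'Great phone'), ('2', 'Battery drains fast')]
```

With Beautiful Soup the same extraction would be a `find_all` over the review containers; the parser class above just keeps the sketch dependency-free.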
