
Pranjal Chandel

Data Engineer

Noida, India
+91-6377880378 pranjalchandel7@gmail.com

Date of birth
25th June, 2000

Nationality
Indian

Profile
Dynamic Data Engineer with over 2 years of expertise in ETL (Extract, Transform & Load) processes and database operations. Proven success in creating impactful data-driven products, notably in the BFSI (Banking, Financial Services & Insurance) sector.

Skills
Communication Skills
Microsoft PowerPoint
Docker
Data Build Tool (DBT)
Airflow
OpenMetadata
Kubernetes
MySQL
PostgreSQL
DataHub
GraphQL
Automation Processes
Microsoft Excel
PySpark
Apache Spark
REST APIs
ChatGPT
AI Tools
Amazon Web Services (AWS)
ETL
Apache Kafka
Databricks
Data Warehousing
Snowflake
HTML/CSS
Python Flask
Python Django
Linux & Shell Scripting

Languages
English

Employment History
Data Engineer, LUMIQ, Noida
July 2022 - Present

• Proficient in PySpark; leveraged its distributed computing capabilities and rich Python APIs to process large-scale datasets efficiently, including batched data streaming.
• Proficient in real-time data streaming and message-queue systems with Apache Kafka; used it successfully in deploying data-centric projects.
• Worked with and wrote REST APIs for seamless data engineering applications.
• Proficient in AWS (Amazon Web Services); specialized in leveraging its robust infrastructure to design scalable, resilient data-driven products, with expertise in services such as S3, RDS, EKS, Lambda, Glue, and Redshift.
• Developed a metadata management system (Pryzm) for data governance and tracking that raises automated alerts, boosting data accuracy by 80%.
• Integrated AI tools such as ChatGPT and its APIs into workflows through effective prompt engineering.
• Handled data models and worked with databases hosted on Databricks using both Hive and the Unity Catalog.
• Streamlined data ingestion, transformation, and delivery for client projects using Snowflake.
• Hands-on expertise in DataHub, a prominent and in-demand tool in the data world, including its actions framework and custom actions.
• Wrote GraphQL queries to fetch metadata and execution details of dataflow pipelines in DataHub.
• Slashed manual monitoring and report-preparation time by 75%, optimizing operational efficiency.
• Implemented several critical functionalities in the governance and monitoring system, such as data quality tests and pipeline anomaly detection.
• Used Python programming extensively to develop new features and functionalities.
• Implemented a proactive incident alert mechanism, providing real-time UI visibility and client email alerts for immediate issue resolution.
• Collaborated with 5+ national and international banks, including the International Bank of Vietnam, ensuring seamless data operations through meticulous metadata monitoring.
• Proficient in diverse databases (MySQL, PostgreSQL, Redshift), crafting complex data engineering queries for data monitoring functionalities such as data quality, data anomalies, pipeline failures, and SLA (service-level agreement) misses.
• Adept in Docker and Kubernetes deployment, essential for streamlined operations in various organizations.
• Experienced in writing REST APIs with Python Flask and Django, enabling robust backends for client-side deployments.
• Developed professional skill in Python, especially in key data engineering libraries such as pandas and NumPy.
• Worked on data scraping to enhance the performance of data-driven products and gain insights into the data.
• Skilled in data build tool (dbt) for writing complex data models, with hands-on experience in Airflow for ETL pipeline management.
• Comprehensive expertise in OpenMetadata (an open-source metadata management tool), covering retrieval, storage, and programmatic management of client metadata.
• Hands-on experience with Jenkins for seamless CI/CD application deployments.
• Specialized knowledge of tech products; contributed actively to data-driven products.
• Utilized RabbitMQ for incident management within a queue system, ensuring efficient handling of issues.
• Wrote automation scripts to automate project testing by emulating real-time data.
• Proficient in Excel and PowerPoint presentations; used both in live client demos.
• Wrote HTML/CSS to contribute to polished client-side frontends.
• Expertise in shell scripting, with hands-on Linux/Ubuntu development experience.
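The data quality, SLA-miss, and anomaly-detection checks described above can be sketched in plain Python. This is only a minimal illustration: the pipeline names, thresholds, and record shapes are hypothetical, and the production work described here ran against warehouse tables with PySpark and SQL.

```python
from datetime import datetime, timedelta

def check_sla(runs, sla=timedelta(hours=1)):
    """Flag pipeline runs whose end-to-end duration exceeded the SLA."""
    misses = []
    for run in runs:
        duration = run["finished_at"] - run["started_at"]
        if duration > sla:
            misses.append((run["pipeline"], duration))
    return misses

def detect_row_anomaly(history, latest, tolerance=0.5):
    """Flag a run whose row count deviates from the historical mean
    by more than `tolerance` (0.5 = 50%)."""
    mean = sum(history) / len(history)
    return abs(latest - mean) > tolerance * mean

# Hypothetical run records; a real monitor would read these from metadata tables.
runs = [
    {"pipeline": "daily_loans", "started_at": datetime(2024, 1, 1, 2, 0),
     "finished_at": datetime(2024, 1, 1, 3, 30)},
    {"pipeline": "daily_deposits", "started_at": datetime(2024, 1, 1, 2, 0),
     "finished_at": datetime(2024, 1, 1, 2, 40)},
]
print(check_sla(runs))  # only daily_loans misses the 1-hour SLA
print(detect_row_anomaly([1000, 1040, 980], latest=400))  # True
```

In a deployed system, each flagged miss or anomaly would feed the alerting path (UI status and client email) rather than a print statement.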

Education
Bachelor of Technology, Jaypee Institute of Information Technology, Noida
July 2018 - July 2022
Graduated in Computer Science.

Python For Data Engineering Certification


Completed an online certification as a Python developer specialized in the data engineering field.
