DIRGH RAJ KUSHWAHA

SUMMARY
Specialist Programmer working as a Data Engineer with over 2.5 years of experience in designing, building, and maintaining data pipelines. Proficient in the programming languages Python and SQL, with a strong understanding of database technologies. Skilled in cloud platforms (Azure/GCP/AWS). Experienced in developing ETL processes and data modeling.

EDUCATION
* Bachelor of Technology (Information Technology), RTU, 2016-2020

CORE SKILLS
* Hadoop
* Spark
* PySpark/Spark-SQL
* Ab Initio
* Shell Scripting
* SQL Scripting
* Programming languages
  o Python
  o C++
* Cloud platforms
  o Azure
  o GCP
  o AWS
  o Azure Databricks
  o Apache Kafka

EXPERIENCE
Infosys
* Specialist Programmer (January 2022 to Present)
* Digital Specialist Engineer (January 2021 to January 2022)

Currently working for a Telecom client
Work & Responsibilities
* Worked on building ETL pipelines to load data from various sources to a cloud data warehouse, ensuring data quality and integrity.
* Implemented GDPR regulations by building a data pipeline using Databricks/GCP/Ab Initio for deletion and anonymization of data upon request.
* Created an Encryption-Decryption Framework for PII data and handled a migration project from Netezza DB to a cloud environment using Ab Initio.
Skills leveraged: Python, SQL, Azure Databricks, GCP, PySpark, ETL, Ab Initio, Shell Scripting

Worked for a Networking client
Work & Responsibilities
* Created ETL pipelines using Azure Data Factory, ingesting data from various sources into the staging layer and processing it with PySpark and Spark SQL.
* Developed facts and dimensions and pushed data to the warehouse for end users.
* Built out the data and reporting infrastructure from the ground up using Power BI and SQL to provide real-time insights into the product.
Skills leveraged: Azure Data Factory, PySpark, Spark SQL, Power BI

ACHIEVEMENTS
* CodeChef: 4 stars.
* Arctic Code Vault Contributor on GitHub.
* Google Code Jam 2018: qualified with rank 2435.
* Finalist in Rajasthan Digifest Hackathon (2019).
* Microsoft Azure Data Fundamentals certified (DP-900).

PROJECTS
Built a DATA PIPELINE using open source tools
* Built a data pipeline using Jitsu as the data integration tool, ClickHouse as the open-source data warehouse, and Superset as the open-source data visualization tool.
* Collected data from various sources and loaded it into a single warehouse.
* Integrated all three tools into a single flow and deployed it.
