
Mobile: +917387666102

Email: kaushikpasi@gmail.com
LinkedIn: www.linkedin.com/in/kaushikpasi

Overview
Firm believer in the term ‘Big Data Science’: merging the goodness of Data Science with the scale of Big Data. Responsible for building
‘Data’ capabilities at StockX. Enjoys building cutting-edge data solutions that deliver real-world impact for clients globally. Expertise
lies in designing and developing cloud-based, high-performance data engineering, big data & analytics solutions that are performant,
scalable, secure, and compliant with the industry’s best practices. Recent client experience includes building big data lakes and
AI platforms in hybrid & multi-cloud environments for FinTech, e-commerce, and home-appliance companies, with use cases such as
recommendation engines, risk & fraud analytics, and Customer 360.

Work Experience
2022-Present Senior Cloud Data & AI Platform Engineer – StockX, Bengaluru
- Building Data & AI platform for Data Engineering and Machine Learning teams
- Architecting data solutions for business use cases
- Creating & managing the entire platform on AWS using IaC (Terraform)
- Implementing Data Catalog & Data Governance

2018-2021 Big Data & Analytics Lead – Oneture Technologies, Mumbai
- Consulting on big data and analytics platform solutions, including data science & pre-sales activities
- Architecting, developing, and fine-tuning big-data lakes and various end-to-end big data
applications
- Developing machine learning applications
- Handled various data sources ranging from structured, semi-structured to unstructured with
both batch & stream processing
- Handled over 6 major clients with diverse problem statements
- Delivered 30+ successful POCs and 20+ projects
2016-2017 Security Consultant – IBM India Pvt. Ltd., Mumbai
- Implemented IBM QRadar & integrated components in the deployment
- Log-source integration and UDSMs
- Created and fine-tuned rules and reports
- Handled over 5 major clients

Education
Master of Technology (M.Tech.) Software Engineering
2016
CGPA: 7.75
Veermata Jijabai Technological Institute – Mumbai, MH

Bachelor of Engineering (B.E.) Computer Engineering
2014
Percentage: 68.13%
Mumbai University

Technical Skills
- Big Data – Spark, Hive, Kafka, Oozie, Sqoop, Flume, Presto, NiFi, Airflow, Databricks
- AWS – EMR, Glue, Athena, Lambda, S3, VPC, EC2, SageMaker, QuickSight, Kinesis, Redshift, Lake Formation
- Azure – Databricks, Data Factory, Synapse, Azure ML, DevOps
- Clouds – AWS, Azure, Google Cloud
- Others – Jenkins, Elasticsearch, Terraform
- Languages – Python, Scala, SQL/PLSQL, Java, C/C++
- Machine Learning, Neural Networks, Deep Learning
- Natural Language & Image Processing
- Visualization Tools – Tableau, PowerBI, QuickSight, Qlik

Projects
2022 Organizational Lakehouse Implementation
Responsibilities:
- Building Data & AI platform for Data Engineering and Machine Learning teams
- Architecting data solutions for business use cases
- Creating & managing the entire platform on AWS using IaC (Terraform)

2021 Project architecting & Center of Excellence
Responsibilities:
- Architected technical solutions for projects & initiatives
- Set up ways of working with best practices
Outcomes:
- Successfully delivered technical solutions to teams

2020-21 BI platform solution
Responsibilities:
- Architected the project solution
- Led the development team in implementing the designed solution
- Worked with business to develop BI solutions for various projects & their BI needs
- Enhanced the platform in terms of features, self-service & governance
Outcomes:
- Successfully rolled out the platform for use at the organizational level
2020-21 Platform Security & Governance
Responsibilities:
- Developed a POC for unified security of various big data & non-big-data platform components
- Designed the solution & implemented it
Outcomes:
- Successfully completed the POC

2020 Self Service & Data Discovery
Responsibilities:
- Developed a POC for a data discovery platform
- Designed the solution & implemented it
Outcomes:
- Successfully completed the POC

2020 Consumer experience and contact center enhancement
Responsibilities:
- Designed the solution along with other architects
- Led the development team in implementing the designed solution
- Processed data from Genesys PureCloud and Google DialogFlow & integrated it into Clarabridge and QlikSense reports
- Computed various KPIs to understand the consumer journey, drive direct sales, manage cost, and increase the efficiency of contact centers
Outcomes:
- Successfully completed & delivered the project to business
- Improved sales & serviceability by providing an outstanding consumer journey

2020 IoT-enabled smart home appliance analytics
Responsibilities:
- Led the development team in implementing the designed solution
- Computed various KPIs related to product usage, defects, service, improvement, etc.
Outcomes:
- Successfully completed & delivered the project to business

2019 Fast processing semi-structured data pipelines
Responsibilities:
- Created big data applications to handle and process semi-structured data in JSON and XML
- Converted the raw data into de-normalized forms, making it easy to query and to source BI tools for insights
Outcomes:
- Successfully completed POCs for 2 use cases, with 1 in production
- Achieved more than 1000x performance improvement over the existing application, which helped the client get insights daily instead of once every quarter

2019 Content management platform
Responsibilities:
- Designed and implemented an end-to-end application for reading various documents and storing the key-value data in a NoSQL database
- Demonstrated a successful working POC for the same
Outcomes:
- Successfully completed the POC; application in final development phases
- Application execution cost is more than 1000x lower than any existing solution

2018-19 Customer 360 (2)
Responsibilities:
- Designed and implemented the entire data backend for two Customer-360 projects
- Provided all the required data components, along with key insights and analytical results, to improve up-sell and cross-sell of products by analyzing customer spend behavior with the help of machine learning models
- Showcased a unified customer timeline from all available channels
- Recommended the best next actions to the client, personalized for each customer
Outcomes:
- Successfully completed the POC for one client; the other is currently in UAT

2018-19 Various analytics application development (4+)
Responsibilities:
- Created ML models for the following use cases: Customer Segmentation, Customer Recommendation for cross-sell & up-sell, Suspicious Transaction Detection, Credit Card Fraud Analytics
- Migrated 10+ existing trained ML models from on-premises to AWS
Outcomes:
- Successfully implemented all models, currently in UAT
- Migration activity in progress, currently in initial phases

2018 Big data CI/CD pipeline
Responsibilities:
- Created a CI/CD pipeline for big data applications
- Automated the application lifecycle process for UAT & Prod
Outcomes:
- Successfully developed the pipeline and handed it over to the client’s release team for operations and administration
2018-19 Big data lakes on clouds (3)
Responsibilities:
- Designed and implemented big data lakes on AWS & Azure
- Planned and implemented end-to-end big data ETL pipelines
Outcomes:
- Successfully implemented two data lakes in production, with one in initial phases of development
- Serving as the organization’s self-service platform for all its data needs

2018-2021 Technical pre-sales & consulting
Responsibilities:
- Met new potential clients to discuss & understand their requirements
- Designed & proposed solutions
- Converted opportunities into clients
Outcomes:
- Acquired 5 new clients for my parent company

2016-17 SIEM Implementation (5)
Responsibilities:
- QRadar Administrator for installation, integration, and deployment of the SIEM architecture in the banking security environment
- Log source integration for supported devices and UDSMs for unsupported ones
- Created and fine-tuned use cases and business/device-specific reports
- Integrated and set up QRM, QVM, and QRIF in QRadar
- Troubleshot issues related to deployment and operations of QRadar
Outcomes:
- Successfully deployed the SIEM with total ownership
- Created standard operating procedures for strategic/important tasks performed

Certifications
Data Engineer
AWS Certified Solutions Architect – Professional
OpenHack: Modern Data Warehousing
Microsoft Certified: Azure Fundamentals
Machine Learning A-Z™: Hands-On Python & R in Data Science
Baseline: Data, ML, AI
BigQuery For Data Analysis
Data Science on the Google Cloud
Machine Learning for Engineering and Science Applications
AWS Certified Solutions Architect – Associate
Apache Spark 2.0 + Scala: Do Big Data Analytics & ML
Taming Big Data with Spark Streaming and Scala
Edureka - Big Data Hadoop Certification

Personal Information
Date of Birth 19th September 1992

Nationality Indian

Passport M1261485

Languages English, Hindi, Marathi

Marital Status Unmarried

Address 1601, Lotus A, Nisarg Greens – Phase 2, Morivali, Ambernath (East), Maharashtra, India - 421501

DISCLAIMER: I hereby declare that all the information provided in this CV is factual and correct to the best of my
knowledge and belief.

Date: 21st March 2022 Kaushik Pasi
