📁 Akshay Dinesh Badgujar – Projects Portfolio
✅ Project 1: Bank of India – Internal Analytics & Automation System (Client Project)
During my tenure supporting one of India’s largest public sector banks, I was responsible for building and
maintaining automation pipelines and analytics dashboards within the Bank of India’s internal systems.
This client project played a critical role in monitoring and improving operational efficiency across
distributed branch networks, where reliability, security, and real-time performance were essential.
Role & Objective:
The project aimed to reduce manual dependency in day-to-day banking analytics workflows, provide
actionable intelligence to operations teams, and improve reporting transparency through a secure and
auditable infrastructure. As a Data Analyst/Support Engineer, I worked alongside senior IT teams to
ensure system uptime, deliver automation scripts, and facilitate reporting accuracy across high-volume
datasets.
Data Ingestion & Automation Pipelines:
One of my key responsibilities was building and optimizing secure data ingestion processes. Using
Python and batch automation tools, I developed scripts to extract transactional records from internal
Oracle-based systems and feed them into structured data marts. These jobs were automated using
scheduling frameworks with role-based access, ensuring compliance with banking-grade security
standards. Each job featured built-in logging and fallback logic to rerun failed batches—key for
maintaining observability.
Dashboards & Internal Reporting:
I built real-time dashboards in Power BI and Excel, presenting metrics such as transaction volumes, ATM
cash flows, and branch-level KPIs. The visualizations were used by regional managers to track day-to-day
performance and highlight anomalies. I implemented dynamic filters, DAX expressions, and interactive
drill-downs, which empowered leadership to monitor hundreds of branches simultaneously with minimal
manual intervention.
Anomaly Detection & Statistical Flagging:
To proactively detect irregularities such as ATM outages or unexpected transaction spikes, I applied
statistical modeling techniques like Z-score-based outlier detection and moving average thresholds using
Python and NumPy. These scripts were integrated into dashboards and alerts that automatically flagged
anomalies to system admins—improving reaction times and reducing customer-impacting issues.
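The production scripts themselves are internal to the bank; a minimal NumPy sketch of the flagging idea, with an illustrative window size and deviation threshold, might look like this:

```python
import numpy as np

def flag_anomalies(series, z_thresh=3.0, window=7):
    """Flag points that are Z-score outliers or deviate sharply from a
    trailing moving average. Thresholds and window are illustrative."""
    x = np.asarray(series, dtype=float)
    # Global Z-score outliers
    z = (x - x.mean()) / x.std()
    z_flags = np.abs(z) > z_thresh
    # Trailing moving average: compare each point to the mean
    # of the preceding `window` points
    ma_flags = np.zeros_like(z_flags)
    for i in range(window, len(x)):
        ma = x[i - window:i].mean()
        if ma > 0 and abs(x[i] - ma) / ma > 0.5:  # >50% deviation
            ma_flags[i] = True
    return np.where(z_flags | ma_flags)[0]
```

Combining a global statistic with a local trailing window catches both one-off spikes and sustained drifts, which is why the two tests were run side by side.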
Data Security & Compliance:
Working in the banking sector required strict adherence to data security protocols. I followed banking-grade security practices, including encrypted storage, access logging, and time-bound credentials. All data
exchanges occurred on secured internal networks. I also maintained a changelog of every deployment to
support audit readiness and rollback capability.
Collaboration & Documentation:
I collaborated with IT managers, business analysts, and regional support staff to understand branch-level
data needs. All scripts, workflows, and dashboards were thoroughly documented using internal templates
and knowledge base articles, ensuring maintainability and smooth handover processes.
Observability & Uptime Monitoring:
To ensure the health of the automation pipelines, I developed monitoring scripts that checked task
statuses, logged exceptions, and sent email alerts in case of failures. This contributed to an SLA
adherence of over 98%, enabling uninterrupted business reporting cycles even during peak loads.
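As a sketch of that alert pattern (the sender address and SMTP host below are placeholders, not the bank's real infrastructure):

```python
import smtplib
from email.message import EmailMessage

def build_failure_alert(job_name, error, recipients):
    """Compose an alert email for a failed batch job. Addresses
    are placeholders for deployment-specific values."""
    msg = EmailMessage()
    msg["Subject"] = f"[ALERT] Batch job failed: {job_name}"
    msg["From"] = "pipeline-monitor@example.internal"
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"Job '{job_name}' failed with error:\n{error}")
    return msg

def check_and_alert(job_statuses, recipients):
    """Scan a {job_name: (ok, error)} status map and return alert
    messages for every failed job."""
    alerts = [build_failure_alert(name, err, recipients)
              for name, (ok, err) in job_statuses.items() if not ok]
    # Delivery would go through the internal relay, e.g.:
    # with smtplib.SMTP("smtp.example.internal") as s:
    #     for m in alerts:
    #         s.send_message(m)
    return alerts
```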
Impact & Outcome:
This project significantly reduced manual reporting workloads (by ~60%), improved issue detection time
by 40%, and supported more than 300 branches with real-time dashboards and automated alerts. It also gave
me hands-on exposure to secure, large-scale financial systems and reinforced my skills in operational
analytics, automation, and ML-aided insights delivery.
Technologies & Tools Used:
Languages & Frameworks: Python, Oracle SQL, Bash
Data & Visualization: Power BI, Excel, DAX
Analytics & ML: NumPy, Pandas, Statistical Flagging, Z-score Modeling
Infrastructure & Observability: REST APIs, Internal Batch Automation Tools, Email Notifiers
Security: Encrypted File Systems, Role-Based Access, Secure Logging
Relevance to AI/ML & Web Engineering Roles:
This project reinforced my ability to work with sensitive data, build observable and scalable systems, and
deliver real-time intelligence using analytics and ML techniques. It closely aligns with AI/ML engineering
and data science roles in high-availability industries such as finance, cloud infrastructure, and
cybersecurity.
✅ Project 2: AI-Powered Healthcare Intelligence & Automation System
This project involved the end-to-end development of a healthcare-focused generative AI system designed
to detect clinical anomalies and automate operational KPI reporting. My goal was to simulate how
observability pipelines can be integrated into healthcare infrastructure for smarter diagnostics, compliance
tracking, and data transparency—critical elements for hospitals, labs, and public health systems.
Use Case & Objective:
Healthcare operations often suffer from data fragmentation and delayed insight generation. The aim of
this project was to streamline real-time insights using AI to detect early signs of critical deviations in
clinical workflows. It also addressed the need for automated reporting pipelines that could plug into
hospital dashboards and internal audit systems.
Model Development:
Using transformer-based models (BERT variants) alongside LSTM hybrids, I developed a generative AI
component that could simulate potential risk scenarios based on patient history and vitals. The models
were trained and tracked using MLFlow, enabling reproducibility and consistent evaluation. Statistical
modeling was used in parallel for comparative benchmarking and sanity checks.
Anomaly Detection & NLP:
I built a pipeline to analyze health sensor feeds and detect anomalies using a combination of PyTorch,
Scikit-learn, and LangChain for structured prompt engineering. This enabled contextual flagging of
outlier vitals (e.g., abnormal heart rates, oxygen levels) and delivered recommendations based on patterns
learned from structured patient datasets.
Automation & Observability:
I set up event-driven alert systems through REST APIs integrated with backend dashboards. The data
pipeline was hosted on Google Cloud Platform (GCP) and structured to ensure secure, HIPAA-aligned
observability for hospital admins. Every event and anomaly was logged, timestamped, and labeled for
audit trails.
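A minimal sketch of such an audit record, assuming illustrative field names rather than any formal HIPAA schema:

```python
import json
from datetime import datetime, timezone

def log_event(event_type, payload, severity="info"):
    """Produce a timestamped, labeled audit record as a JSON line.
    Field names are illustrative, not a compliance specification."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,
        "severity": severity,
        "payload": payload,
    }
    return json.dumps(record)
```

Emitting one structured JSON line per event keeps the trail both machine-parsable for dashboards and human-readable for auditors.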
Visualization & Reporting:
Using Power BI, I built interactive dashboards for doctors and operations managers that visualized live
vitals, predicted risk zones, and presented system confidence scores for each output. The dashboards
supported dynamic filtering and real-time refreshes, providing direct value to stakeholders.
Technologies & Tools Used:
Languages & ML: Python, SQL, TensorFlow, PyTorch, Scikit-learn, MLFlow
Libraries: LangChain, Pandas, NumPy
Infrastructure: GCP (Compute Engine, Cloud Functions), REST APIs
Visualization: Power BI
Security: Role-based access, data masking, anomaly logs
Impact & Alignment with AI/ML Roles:
This project sharpened my skills in LLM integration, real-time anomaly detection, and healthcare
analytics. It reflected how modern AI architectures can be operationalized in sensitive environments,
contributing directly to goals like observability, model transparency, and autonomous decision-making.
✅ Project 3: A Comparative Study: Business Intelligence Tools
This academic-industry hybrid project involved a published comparative study of popular BI platforms,
where I evaluated the scalability, security features, and usability of various tools in cloud and hybrid
infrastructure environments. It laid the foundation for understanding how data observability and analytics
platforms are benchmarked for enterprise adoption.
Project Scope & Publication:
The research was published in IJRESM, Volume 5, Issue 1, January 2022, and it focused on
benchmarking BI tools across several criteria relevant to enterprises adopting AI and observability
frameworks. The tools studied included Power BI, Pentaho, Jaspersoft, and Snowflake, among others.
Evaluation Framework:
The tools were assessed across four dimensions:
Scalability (horizontal and vertical scaling across datasets)
Security Integration (support for RBAC, encryption, and audit trails)
Usability (user interface design, data transformation ease, accessibility)
Observability Readiness (ability to integrate with real-time pipelines and API feeds)
Security & Compliance Testing:
I conducted structured experiments to simulate RBAC models, enforced encryption layers, and tested
GDPR-compliance flags for each BI solution. This offered insight into how ready each platform was for
cloud-native enterprise environments.
Performance & Visualization:
Dashboards were built and tested using Power BI, Jaspersoft, and Pentaho, with datasets structured in
Snowflake and Excel. Performance was benchmarked based on response times, chart rendering, and
backend query efficiency.
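The benchmark scripts were custom-built for the study; a simplified timing harness along those lines (run counts and percentile choice are illustrative) might look like:

```python
import time
import statistics

def benchmark(fn, runs=20):
    """Time repeated calls to `fn` and report median and p95 latency
    in milliseconds, a simple stand-in for the study's scripts."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    return {
        "median_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
    }
```

Reporting a tail percentile alongside the median mattered for the comparison, since some platforms showed acceptable typical latency but poor worst-case rendering times.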
Use of AI & Automation:
Although the study was not based on traditional ML models, AI was indirectly evaluated through each
platform’s support for predictive analytics, AI visualizations, and integration with external ML pipelines
—key for long-term infrastructure planning.
Technologies & Tools Used:
BI Platforms: Power BI, Pentaho, Jaspersoft
Data Platforms: Snowflake, Excel
Frameworks & Metrics: Security-Focused BI Benchmarks, RBAC, Scalability Indexing
Testing Tools: Custom Python scripts, SQL benchmarks
Impact & Alignment with Deep Observability & Product Roles:
This study gave me foundational insights into how large-scale organizations evaluate technology
platforms for analytical maturity and AI readiness. It aligns closely with roles that require an
understanding of product-market fit, infrastructure observability, and enterprise tooling evaluation,
especially in hybrid cloud environments.
✅ Project 4: Ecotokari – Full Website & Database Development
At Ecotokari, a sustainability-focused startup, I served as the sole developer for a full-fledged web
solution. This end-to-end project involved crafting both the front-end and back-end from scratch,
showcasing my ability to conceptualize and deliver scalable, user-friendly, and technically sound web
infrastructure that aligned with real business needs.
Front-End Development:
Using HTML5, CSS3, and JavaScript, I created a clean, mobile-responsive interface that offered
consistent user experience across devices and browsers. I designed a modern layout with intuitive
navigation, ensuring optimal user engagement and visual consistency with the brand’s eco-centric
messaging. The front-end also featured custom animations, content sliders, and a fully accessible structure
to ensure inclusivity and search-engine visibility.
Content Management:
To streamline content updates, I utilized WordPress for CMS flexibility, creating custom templates and
enabling non-technical stakeholders to manage product pages and announcements. I also integrated
Elementor for visual editing, empowering marketing teams to easily adjust homepage banners and
promotions without breaking design consistency.
Backend Development & Server Management:
The backend was developed using PHP and MySQL. I designed a robust relational database schema for
product inventory, user registrations, order forms, and content blocks. To facilitate seamless
communication between the database and the UI, I wrote secure PHP-based RESTful APIs and
implemented form validation with error handling to ensure data integrity. I also set up admin-facing
dashboards to manage product listings, review user feedback, and export reports in CSV format.
Security & Hosting:
Security was a key consideration throughout. I implemented server-side validation, user authentication,
and secure form processing to prevent injection attacks and unauthorized access. Hosting and deployment
were managed via cPanel on an Apache server, where I handled DNS configurations, SSL certificate
installations, and scheduled backups to ensure uptime and data reliability.
Performance Optimization & Analytics:
I optimized loading speeds through image compression, minimized render-blocking JavaScript, and
employed lazy loading strategies. Additionally, I installed Google Analytics and used its insights to iterate
on content placement and user journey improvements.
Impact & Alignment with Role:
This project demonstrated my capacity to take full ownership of a project lifecycle—from ideation to
deployment. It highlights my experience with CMS platforms, full-stack development, cross-team
collaboration, responsive design, security compliance, and performance optimization. These experiences
are directly relevant to a Web Specialist role requiring versatile technical skills, user-centered thinking,
and content delivery across dynamic platforms.
✅ Project 5: SCD Nutritional Pvt. Ltd. – Website and Cross-Platform Mobile App Development
At SCD Nutritional Pvt. Ltd., I led the end-to-end development of both the corporate website and a cross-platform mobile application designed to support their health-focused product line. This comprehensive
project allowed me to apply full-stack, mobile, and ML-aligned capabilities in a real-world business
setting.
UX/UI Design & Prototyping:
I began with deep collaboration alongside the business and product teams to understand user needs and
brand objectives. I converted those insights into detailed wireframes and clickable design prototypes
using Figma and Adobe XD, ensuring a smooth transition into front-end development with a pixel-perfect UI aligned with accessibility and responsiveness standards.
Front-End Web Development:
For the website, I used HTML5, CSS3, and JavaScript to develop the public interface, while leveraging
PHP to integrate backend functionality and secure form submissions. I emphasized mobile-first design
and interactive UI elements that provided seamless engagement across devices. Brand consistency was
maintained throughout with color schemes, iconography, and custom design elements.
Mobile App Development with ML Integration:
The mobile application was built using React Native, making it deployable on both iOS and Android. It
featured:
Real-time nutrition tracking tools that used Python-based ML models deployed via Firebase Functions to recommend personalized dietary suggestions.
A customer support chatbot, integrating rule-based NLP responses powered by a lightweight transformer model.
Push notifications and behavioral insights, triggered by user activity and supported by Firebase Cloud Messaging.
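The rule-based tier of the chatbot can be sketched as simple keyword-overlap intent matching (the intents and replies below are illustrative, not the production rule set):

```python
def match_intent(message, intents):
    """Return the reply for the intent whose keywords overlap most
    with the user message; fall back to a default reply."""
    words = set(message.lower().split())
    best, best_overlap = None, 0
    for intent, (keywords, reply) in intents.items():
        overlap = len(words & keywords)
        if overlap > best_overlap:
            best, best_overlap = reply, overlap
    return best or "Sorry, I didn't catch that. Could you rephrase?"

INTENTS = {  # illustrative rules only
    "order_status": ({"order", "shipping", "delivery"},
                     "You can track your order under My Account > Orders."),
    "nutrition": ({"calories", "protein", "nutrition"},
                  "Nutrition facts are listed on each product page."),
}
```

Messages that matched no rule were escalated to the transformer-backed tier rather than answered by the fallback alone.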
Backend & Database Systems:
The backend stack included Node.js with MongoDB, and data storage through Firebase Firestore. I
created secure APIs for registration, login, and tracking modules, while ensuring all data operations were
secure, scalable, and validated. The system architecture supported real-time data syncing and offline-first
functionality.
Admin Dashboard:
An interactive dashboard was built for internal users to manage content, monitor app engagement, and
track user submissions. Admins could push new content or update recommendations dynamically using a
browser-based panel.
Data Science & ML Integration:
I used historical nutrition datasets to train a lightweight ML model in Python for food classification and
health scoring, and deployed it using Firebase. This integration personalized meal recommendations and
demonstrated real-world application of ML for wellness and behavioral insights.
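The trained model itself is not public; as a hedged sketch, the health-scoring idea reduces to a weighted combination of nutrient features (the weights below are illustrative, not the learned parameters):

```python
def health_score(item, weights=None):
    """Score a food item on a 0-100 scale from per-100g macros.
    Weights are illustrative, not the trained model's parameters."""
    weights = weights or {"protein_g": 2.0, "fiber_g": 3.0,
                          "sugar_g": -1.5, "sat_fat_g": -2.0}
    raw = sum(weights[k] * item.get(k, 0.0) for k in weights)
    return max(0.0, min(100.0, 50.0 + raw))  # clamp around a neutral 50
```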
Impact & Alignment with Role:
This project strengthened my ability to build scalable, multi-platform solutions with robust backend
infrastructure and real-time ML integrations. It reinforced my full-stack web and mobile capabilities,
design-thinking approach, and readiness to contribute to AI/ML-enabled digital tools—making it highly
relevant to both software engineering and web specialist positions.
✅ Project 6: Wheelking Inc., Canada – Full-Stack Enterprise Application Development
At Wheelking Inc., I was part of a cross-border development team responsible for building and
maintaining an enterprise-level application for logistics and operations management. This project gave me
critical experience in enterprise full-stack workflows, data security, back-end APIs, and customer-focused
UI development in a production environment.
Front-End Architecture & Design:
Using Angular, JavaScript, Bootstrap, and HTML5/CSS3, I developed modular, reusable UI
components that supported responsive design and cross-browser compatibility. The system’s front-end
allowed staff and customers to manage orders, track shipments, and generate quotes dynamically. I
designed complex views with sorting, filtering, and validation logic to handle thousands of data rows
efficiently.
Backend Development with .NET & SQL Server:
On the server side, I built and extended APIs using .NET Core, integrating RESTful services for secure
CRUD operations. I also created stored procedures and managed backend logic in SQL Server to handle
real-time order processing, logistics data, user management, and shipping integrations. Special care was
taken to ensure data consistency and transaction integrity using ACID principles and rollback-safe
procedures.
AI/ML Integration for Operations Analytics:
To support logistics optimization, I implemented a prototype feature that analyzed historical delivery and
route data using Python and Scikit-learn. The model provided predictive estimates for delivery delays
and suggested route adjustments. This functionality was integrated into internal dashboards to inform
dispatch decisions and improve planning accuracy.
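The prototype used Scikit-learn; a minimal stand-in using ordinary least squares in NumPy shows the shape of the approach (the features and units are illustrative):

```python
import numpy as np

def fit_delay_model(X, y):
    """Fit delay (e.g. minutes) against route features with ordinary
    least squares; the production prototype used Scikit-learn."""
    A = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coef  # [intercept, feature weights...]

def predict_delay(coef, features):
    return float(coef[0] + np.dot(coef[1:], features))
```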
Security, Observability & Logging:
Security was implemented using JWT tokens for authentication and role-based access controls. I also
built logging mechanisms for both frontend and backend events using Serilog and Angular Interceptors,
which improved issue traceability and observability—a skill directly related to modern DevOps
workflows.
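The project used .NET's JWT middleware; the underlying HS256 sign-and-verify mechanics can be sketched with Python's standard library alone:

```python
import base64
import hashlib
import hmac
import json

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Create an HS256 JWT: base64url(header).base64url(payload),
    signed with HMAC-SHA256."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64(sig)}"

def verify_jwt(token: str, secret: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return hmac.compare_digest(_b64(expected), sig)
```

In production the claims also carried role identifiers, which the role-based access checks read after signature verification.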
DevOps & Deployment:
I participated in the deployment process using Azure DevOps, where I assisted in CI/CD pipeline
configurations and release versioning. Code reviews were conducted in a collaborative GitHub
environment with pull requests, automated tests, and peer feedback cycles. These practices helped me
adopt real-world software lifecycle methodologies.
Performance Monitoring & Optimization:
I conducted load testing for key backend APIs and implemented caching strategies using MemoryCache
to reduce query response times by over 30%. UI performance was improved through debouncing
techniques, lazy loading, and modular bundle optimizations in Angular.
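MemoryCache is a .NET API; the same read-through caching idea can be sketched with Python's functools.lru_cache (the lookup function and its return value are illustrative):

```python
from functools import lru_cache

CALLS = {"count": 0}  # stands in for counting real DB round-trips

@lru_cache(maxsize=256)
def order_summary(order_id: int) -> dict:
    """Expensive lookup cached by order_id; the .NET service used a
    MemoryCache entry keyed the same way."""
    CALLS["count"] += 1  # only incremented on a cache miss
    return {"order_id": order_id, "status": "shipped"}
```

Keying the cache by the query parameter and bounding its size gives the same latency win as the MemoryCache strategy without unbounded memory growth.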
Impact & Alignment with Role:
This project gave me strong hands-on experience in full-stack development for scalable, data-heavy
enterprise applications. It sharpened my understanding of software architecture, business logic
integration, and real-world implementation of predictive analytics. My contributions supported both
technical depth and system-level design thinking, preparing me for roles that combine software
engineering, machine learning, and cloud infrastructure, closely aligned with the responsibilities of
AI/ML engineering internships and Web Specialist roles.
✅ Project 7: Code.EX UTD – Website Maintenance & Feature Enhancements
As a Website Developer for Code.EX, a student-led tech organization at UT Dallas, I was responsible for
maintaining and improving the club’s digital presence while integrating lightweight AI/ML insights and
ensuring platform scalability. This project provided me with hands-on experience in managing web
content systems while subtly applying data science and observability principles for better engagement.
Content Management & Web Development:
Using HTML5, CSS3, JavaScript, and WordPress with Elementor, I handled weekly updates to the
website’s events, announcements, and blog pages. I customized templates and styling elements to align
with brand guidelines and ensured the site remained visually cohesive and mobile responsive. This helped
support the ongoing SDLC for our digital assets and provided a reliable user experience for hundreds of
student members.
Data Analytics & Engagement Optimization:
To understand how students engaged with our content, I integrated Google Analytics and built
customized dashboards using Power BI and Excel, allowing me to visualize traffic sources, popular
content areas, and user interaction metrics. These insights were shared with the leadership team to drive
data-informed decisions regarding site layout and event promotions.
ML & NLP-based Personalization (Prototype):
To prototype content personalization, I created a lightweight recommendation system using Python,
Scikit-learn, and Pandas that clustered user feedback and suggested personalized event tags and
categories. Though not deployed in production, the model's backend integration was tested using REST
APIs and MLFlow for tracking model performance during local evaluations.
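The prototype relied on Scikit-learn's KMeans; a minimal NumPy re-implementation illustrates the clustering step (the data and choice of k are illustrative):

```python
import numpy as np

def kmeans(points, k=2, iters=50, seed=0):
    """Minimal k-means: assign points to nearest centroid, then move
    each centroid to the mean of its cluster, until convergence."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = pts[rng.choice(len(pts), k, replace=False)]
    labels = np.zeros(len(pts), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(pts[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([pts[labels == j].mean(axis=0)
                        if np.any(labels == j) else centroids[j]
                        for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```

Feedback vectors falling into the same cluster shared a suggested tag set, which is all the prototype needed for local evaluation.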
Performance Optimization & Observability:
I implemented layout restructuring and image compression that reduced page load times by over 35%.
Basic observability was enhanced through custom logging and alert mechanisms using JavaScript
listeners and Google Tag Manager, helping the team identify and debug common drop-off points in the
user journey.
Security & SEO Compliance:
I ensured proper meta tagging, HTTPS enforcement, and updated plugins to protect against
vulnerabilities. Basic SEO best practices were followed to improve visibility, ensuring alignment with
broader communications goals of the organization.
Cross-Functional Collaboration & Agile Practice:
My work followed an agile model with weekly sprints and retrospectives, collaborating with team leads
and event coordinators to align development work with club milestones. Tasks were tracked using
Smartsheet, and Git was used for version control.
Tools & Technologies Used:
Frontend & CMS: HTML5, CSS3, JavaScript, WordPress, Elementor
Backend & APIs: PHP, REST APIs, JSON
Data Science & ML: Python, Scikit-learn, Pandas, MLFlow
Visualization: Power BI, Excel
Libraries & Infrastructure: NumPy, Keras, TensorFlow, LangChain (tested integration), Git,
Smartsheet
Cloud & Hosting: GCP (static hosting tests), Microsoft Azure (project sync and Git repos)
Impact & Alignment with Role:
This role enhanced my ability to combine full-stack development with analytics, observability, and
lightweight ML integration. It allowed me to experiment with personalization using NLP, understand
traffic behavior, and operate within an agile, collaborative setting. These contributions align closely with
software engineering and AI/ML-driven web roles requiring a blend of technical depth, communication,
and user-first design thinking.
✅ Project 8: Product Management Club (PMC) at UT Dallas – Website Operations & Redesign
As the Website Lead for the Product Management Club (PMC) at UT Dallas, I managed the full lifecycle
of the club’s digital infrastructure—from routine content operations to a partial site redesign aimed at
improving usability, accessibility, and engagement. This project allowed me to integrate modern CMS
tools, apply data analytics, and experiment with backend scripting and ML-based enhancements for
content optimization.
Website Maintenance & Content Management:
The platform was developed using WordPress and customized through Elementor and PHP, allowing
me to maintain the live site with dynamic updates to member directories, registration forms, and event
calendars. I established structured content workflows in Smartsheet, ensuring timely publication of club
events and campaigns. These updates aligned with user-first principles and were built with accessibility
compliance in mind.
Redesign for UX & Accessibility:
I led the redesign of the homepage and event pages, improving the visual hierarchy and user flow.
Designs were wireframed in Figma and translated into the production environment using custom
JavaScript, HTML5, and CSS3. Navigation was optimized for both desktop and mobile using
responsive layout practices and tested across multiple devices for cross-browser consistency.
Data Analytics & Engagement Tracking:
To guide redesign decisions, I implemented Google Analytics and exported behavioral insights into
Power BI dashboards, helping club officers understand which pages drove the most traffic and where
users dropped off. These dashboards were updated monthly and helped inform design, content strategy,
and call-to-action placements.
Backend Scripting & Dynamic Templates:
I customized backend functionality using PHP to enable dynamic member listings, automated event
registration confirmation, and role-based content access. These scripts supported secure interactions
between the front-end forms and the database while reducing manual update overhead for the executive
team.
Lightweight AI/ML Exploration:
To enhance content discoverability, I prototyped a keyword classification system using Python, Scikit-learn, and TF-IDF vectorization on existing event metadata. This model categorized events and
suggested optimal tags for future publishing. While this prototype wasn’t deployed, it illustrated how ML
could be applied for content recommendation and internal search optimization.
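Scikit-learn's TfidfVectorizer did the heavy lifting in the prototype; a standard-library sketch of the TF-IDF weighting and tag suggestion, with illustrative documents:

```python
import math
from collections import Counter

def tfidf(docs):
    """Per-document TF-IDF weights: term frequency scaled by the log
    inverse document frequency (stdlib sketch of TfidfVectorizer)."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    weighted = []
    for toks in tokenized:
        tf = Counter(toks)
        weighted.append({t: (tf[t] / len(toks)) * math.log(n / df[t])
                         for t in tf})
    return weighted

def suggest_tags(doc_index, weights, top_n=2):
    """Top-N highest-weight terms of a document as suggested tags."""
    ranked = sorted(weights[doc_index].items(), key=lambda kv: -kv[1])
    return [term for term, _ in ranked[:top_n]]
```

Terms appearing in every event description get zero weight, so generic words never surface as tags; only distinctive terms are suggested for publishing.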
Observability & Performance:
I ensured optimal performance by minimizing plugin load, optimizing media assets, and using lazy
loading techniques. I also tested error tracking scripts using custom logging via JavaScript and
Cloudflare analytics to improve uptime awareness.
Tools & Technologies Used:
Frontend & Design: WordPress, Elementor, HTML5, CSS3, JavaScript, Figma
Backend & Automation: PHP, REST APIs, Git
Analytics & ML: Power BI, Google Analytics, Python, Scikit-learn, NumPy, Pandas, MLFlow
(experiments), TF-IDF
Infrastructure: Smartsheet, Google Cloud Platform (GCP), Microsoft Azure, GitHub
Impact & Alignment with Role:
This project deepened my experience managing a CMS-driven ecosystem while aligning with software
engineering and AI/ML principles. It combined the strategic priorities of a university communications
team with technical execution across web, analytics, and emerging ML applications. My work helped
improve user engagement, streamline operations, and showcase how AI can enhance even small-scale
content platforms—preparing me for roles that demand full-stack versatility and a data-first mindset.