MIT OCW Machine Learning Learning Roadmap

Phase 1: Foundations
- Linear Algebra: 18.06 (Gilbert Strang) – https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010/
- Probability & Statistics: 6.041 – https://ocw.mit.edu/courses/6-041-probabilistic-systems-analysis-and-applied-probability-fall-2010/
- Calculus: 18.01 and 18.02 – https://ocw.mit.edu/search/?q=18.01+18.02
- Python Programming: 6.0001 – https://ocw.mit.edu/courses/6-0001-introduction-to-computer-science-and-programming-in-python-fall-2016/

Phase 2: Machine Learning Core
- Intro to ML: 6.036 – https://ocw.mit.edu/courses/6-036-introduction-to-machine-learning-fall-2020/
- Graduate ML: 6.867 – https://ocw.mit.edu/courses/6-867-machine-learning-fall-2006/
- Intro to AI: 6.034 – https://ocw.mit.edu/courses/6-034-artificial-intelligence-fall-2010/

Phase 3: Deep Learning & Applications
- Deep Learning for Self-Driving Cars – https://selfdrivingcars.mit.edu/
- MIT Deep Learning (6.S191) – http://introtodeeplearning.com/
- Deep Learning Book – https://www.deeplearningbook.org/
- CS231n (Stanford, bonus) – http://cs231n.stanford.edu/

Suggested Learning Order:
1. Phase 1: Core Math & Programming (1–2 months)
2. Phase 2: Intro & Intermediate ML (1–2 months)
3. Phase 3: Deep Learning (2–3 months)

6-Month Rubber Tree Detection Project Plan
Goal: Develop an AI-based application that counts rubber trees in aerial/satellite images using deep learning models (e.g., YOLO for detection or UNet for segmentation).

Month 1–2: ML & CV Foundations
- Learn Python, NumPy, Pandas
- Study ML basics: supervised learning, CNNs
- Practice image classification (MNIST, CIFAR); see the sketch after this list
Resources:
- MIT 6.0001, 6.036, FastAI course (https://course.fast.ai/)
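
As a warm-up for the classification practice above, here is a minimal MNIST sketch in PyTorch; the tiny CNN and hyperparameters are illustrative choices, not prescribed by the plan (assumes torch and torchvision are installed):

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Load MNIST as normalized tensors (downloads on first run).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])
train_set = datasets.MNIST("data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

# A tiny CNN: two conv blocks, then a linear classifier over the 10 digits.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 28x28 -> 14x14
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(2):  # two epochs already show clear learning on MNIST
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last-batch loss {loss.item():.3f}")

The same loop transfers to CIFAR by swapping the dataset and changing the first conv layer's input channels from 1 to 3.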

Month 3–4: Deep Learning for Vision
- Study object detection (YOLOv5) and semantic segmentation (UNet)
- Practice labeling trees in images using LabelImg or Roboflow
- Start training small models on sample data; a pretrained-inference sketch follows this list
Resources:
- MIT 6.S191, CS231n, YOLOv5 GitHub (https://github.com/ultralytics/yolov5)
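
Before training on plantation imagery, it is worth verifying the detection pipeline end to end with a pretrained model. A minimal YOLOv5 inference sketch via torch.hub, following the ultralytics/yolov5 README (the image path is a placeholder):

import torch

# Load the small pretrained YOLOv5 model from the Ultralytics hub repo.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# Run detection on a sample image (placeholder path).
results = model("sample_plantation.jpg")

# Detections as a DataFrame: xmin, ymin, xmax, ymax, confidence, class, name.
detections = results.pandas().xyxy[0]
print(detections.head())
print(f"objects detected: {len(detections)}")

Custom training on your tree labels then replaces the pretrained COCO classes, typically via the repo's train.py.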

Month 5: Rubber Tree Detection System
- Use drone or satellite images of rubber plantations
- Train and validate the tree detection model
- Evaluate using precision, recall, and count accuracy (see the sketch after this list)
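
A sketch of the evaluation step, assuming detections have already been matched to ground-truth trees (e.g., at an IoU threshold of 0.5); "count accuracy" is interpreted here as one minus the relative counting error, which is one reasonable reading of the plan:

def detection_metrics(true_positives, false_positives, false_negatives):
    """Precision/recall from matched detections (IoU matching done upstream)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

def count_accuracy(predicted_count, true_count):
    """1 - relative counting error; 1.0 means the predicted tree count is exact."""
    return 1.0 - abs(predicted_count - true_count) / true_count

# Example: 92 correct boxes, 8 spurious boxes, 10 missed trees out of 102.
p, r = detection_metrics(92, 8, 10)
print(f"precision={p:.2f} recall={r:.2f} count_acc={count_accuracy(100, 102):.2f}")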

Month 6: App Deployment
- Build the app with Streamlit or Gradio
- Load images, run detection, output results
- Export counts and geolocations
- Bonus: add a CSV/GeoJSON export feature (see the sketch after this list)
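
A minimal Streamlit sketch of the deployment flow; run_detection is a hypothetical stand-in for the trained model, and the CSV download mirrors the bonus item (save as app.py, run with: streamlit run app.py):

import pandas as pd
import streamlit as st
from PIL import Image

def run_detection(image):
    """Hypothetical stand-in: replace with the trained YOLOv5/UNet model's inference."""
    return pd.DataFrame({"xmin": [10], "ymin": [20], "xmax": [50], "ymax": [60],
                         "confidence": [0.9]})

st.title("Rubber Tree Counter")
uploaded = st.file_uploader("Upload an aerial image", type=["jpg", "jpeg", "png", "tif"])
if uploaded is not None:
    image = Image.open(uploaded)
    st.image(image, caption="Input image")
    detections = run_detection(image)
    st.metric("Tree count", len(detections))
    st.dataframe(detections)
    # Bonus: CSV export of per-tree boxes (GeoJSON needs the image's georeference).
    csv_bytes = detections.to_csv(index=False).encode("utf-8")
    st.download_button("Download counts as CSV", csv_bytes, "tree_counts.csv", "text/csv")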

Tools & Libraries:
- Python, PyTorch, OpenCV, Streamlit, Roboflow, LabelImg
- Pretrained models: YOLOv5, DeepForest, Segment Anything

6-Month Kaggle Training Plan for Computer Vision & ML
Goal: Build practical data science and computer vision skills by participating in Kaggle competitions, courses, and projects to reinforce ML and deep learning knowledge.

Month 1: Kaggle Basics & Python Practice
- Set up a Kaggle account and explore the UI
- Complete the Python, Pandas, and Intro to ML micro-courses
- Join the Titanic competition and submit your first model

Month 2: Supervised ML Projects
- Work on House Prices, Spaceship Titanic, or Tabular Playground Series
- Focus on data cleaning, feature engineering, and model selection (Random Forest, XGBoost)
- Learn model evaluation: accuracy, RMSE, confusion matrix (see the sketch after this list)
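
A sketch of the evaluation step with scikit-learn, on synthetic data so it runs standalone; accuracy and the confusion matrix fit Titanic-style classification, while RMSE is the House Prices-style regression metric:

from sklearn.datasets import make_classification, make_regression
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import accuracy_score, confusion_matrix, mean_squared_error
from sklearn.model_selection import train_test_split

# Classification metrics: accuracy and confusion matrix.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("confusion matrix:\n", confusion_matrix(y_te, pred))

# Regression metric: RMSE (root of the mean squared error).
Xr, yr = make_regression(n_samples=500, n_features=10, noise=10, random_state=0)
Xr_tr, Xr_te, yr_tr, yr_te = train_test_split(Xr, yr, random_state=0)
reg = RandomForestRegressor(random_state=0).fit(Xr_tr, yr_tr)
print("RMSE:", mean_squared_error(yr_te, reg.predict(Xr_te)) ** 0.5)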

Month 3: Deep Learning Foundations
- Complete the Deep Learning micro-course
- Start using TensorFlow/Keras or PyTorch for classification (a Keras sketch follows this list)
- Join the Digit Recognizer competition (MNIST)
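
For the Keras route, a minimal Digit Recognizer-style classifier; it mirrors the PyTorch sketch in the project plan above, and keras.datasets.mnist downloads the data on first run:

from tensorflow import keras

# Load MNIST and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small dense network is enough for a first submission.
model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_split=0.1)
print("test accuracy:", model.evaluate(x_test, y_test)[1])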

Month 4: Computer Vision Projects
- Join PetFinder, Cassava Leaf Disease, or HappyWhale competitions
- Train CNNs using transfer learning (ResNet, EfficientNet)
- Learn image augmentation and dataset preprocessing techniques (see the sketch after this list)
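
A sketch of the transfer-learning setup with torchvision (0.13+ weights API): a standard augmentation pipeline plus an ImageNet-pretrained ResNet with a new classification head; the class count is a placeholder for whichever competition you join.

import torch.nn as nn
from torchvision import models, transforms

# Typical augmentation pipeline for training images.
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),  # ImageNet stats
])

# Load an ImageNet-pretrained ResNet and freeze its backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

num_classes = 5  # placeholder: e.g., Cassava Leaf Disease has 5 classes
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new head stays trainable

Only model.fc's parameters need to go to the optimizer while the backbone is frozen; unfreezing the last blocks for fine-tuning is a common second step.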

Month 5: Object Detection
- Study YOLOv5/YOLO-NAS basics
- Join competitions with bounding-box labels (e.g., Global Wheat Detection, Tree Crown Detection)
- Learn annotation tools like LabelImg or Roboflow (a label-format sketch follows this list)
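
Detection work constantly involves converting label formats; a sketch of turning a corner-coordinate pixel box (as LabelImg's Pascal VOC output or a competition CSV provides) into the normalized center-based format YOLO expects:

def voc_to_yolo(xmin, ymin, xmax, ymax, img_w, img_h):
    """Corner pixel coords -> YOLO format: normalized center x/y, width, height."""
    x_center = (xmin + xmax) / 2 / img_w
    y_center = (ymin + ymax) / 2 / img_h
    width = (xmax - xmin) / img_w
    height = (ymax - ymin) / img_h
    return x_center, y_center, width, height

# Example: a 100x200 px box with top-left corner (50, 100) in a 1024x1024 image.
print(voc_to_yolo(50, 100, 150, 300, 1024, 1024))
# YOLO label files hold one line per box: "<class> x_center y_center width height"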

Month 6: Capstone + Portfolio
- Choose a vision competition or your own tree-counting dataset
- Train a model, evaluate it, and build a report/notebook
- Upload a polished Kaggle Notebook with documentation and visuals
- Bonus: publish on GitHub or Medium

Tips for Success:
- Read winning solutions and discussions
- Use Kaggle Notebooks and Datasets for inspiration
- Work consistently (1–2 hours daily or 10–15 hours weekly)
- Focus on learning, not leaderboard position, early on