
 

CRM Using AI to Improve Student Learning
June 23, 2019

Overview 
This document presents a product overview of a CRM system that helps students learn better by using machine learning to address their individual needs. The system collects feedback from students about their understanding of each topic covered in the course and, based on that feedback, recommends additional reading materials through our platform. The key features are that it is completely customised to the individual needs of each student rather than generic, and that it helps them understand concepts better. Here is an overview of the process.

We use the students' survey results and learning styles to email them suggestions on ways to improve their understanding of classroom topics. For example, "John Doe" rates his understanding of "Algebra - Expressions and Equations" as an "8" on a scale of 1-10. His score is recorded, and since we know he likes to learn through "audio" (one of the four learning styles), the system crawls the internet, finds resources for "Algebra - Expressions and Equations" that focus on "audio" learning, and automatically sends them to "John Doe" via email. The web application collects this data and stores it in an easy-to-use CRM where reports can be generated on the data. As more data is collected and more is learned about each student, the recommendations improve week by week, since the surveys go out weekly.
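To make this flow concrete, here is a minimal, purely illustrative sketch of the weekly recommendation loop. All names (Student, RESOURCES, recommend, send_email) and the example URL are hypothetical placeholders, not part of any existing codebase.

```python
# Minimal sketch of the weekly recommendation loop described above.
# Every name and URL here is a hypothetical placeholder.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    email: str
    learning_style: str  # one of the four learning styles, e.g. "audio", "visual"

# Toy resource store: (topic, learning_style) -> list of URLs; in practice this lives in the CRM database.
RESOURCES = {
    ("Algebra - Expressions and Equations", "audio"): [
        "https://example.com/algebra-audio-lecture",  # placeholder URL
    ],
}

def recommend(student: Student, topic: str, self_score: int) -> list[str]:
    """Pick resources matching the student's learning style for the surveyed topic."""
    return RESOURCES.get((topic, student.learning_style), [])

def send_email(student: Student, urls: list[str]) -> None:
    # In production this would go through an email service; here we just print.
    print(f"To {student.email}: based on your survey, try these resources:")
    for url in urls:
        print(" -", url)

john = Student("John Doe", "john.doe@example.com", "audio")
send_email(john, recommend(john, "Algebra - Expressions and Equations", self_score=8))
```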

 

Goals 
1. Build a robust CRM with data security and privacy protection: We will design a well-structured database schema and restrict the visibility of sensitive information so that only the intended audience can access it. The CRM, a rich repository of high-quality data, is a valuable asset, and its contents will not be exposed through the web application. Access checks and safety measures will be put in place, and the data pipeline will be tightly monitored to minimise the risk of data breaches.

2. Use machine learning to better assess the needs of students: The machine learning part comes into play after the web scraping, once the links have been collected. The first task is to classify the links into the various learning methodologies, i.e. visual, textual, audio-based, etc. A decision tree or another ML model might be employed for this categorisation, looking at parameters such as the domain names of the URLs, the tags used inside each webpage, and the links the page leads to. With these parameters (currently hypothetical; some research and testing is needed to determine the right ones), we perform the first level of classification for a particular URL. We then rank each URL by complexity, i.e. whether it is a beginner-friendly resource or towards the more technical end. To give a better sense of what we mean, grammarly.com has a feature that classifies a piece of text as beginner, intermediate or advanced based on the choice of words; similarly, our model will grade each URL on a scale of, say, 1-100 based on its confidence levels. Once the student gives their feedback, the model picks out a URL for that particular student. After visiting the URL, they can report whether the material helped, felt a bit advanced, or was too basic. Based on this continuously gathered information, the model gives better recommendations specific to each student. So if two students grade themselves an 8 on a particular topic, they will not necessarily get the same URL recommendations: your rating yourself an 8 is different from my rating myself an 8, and the model will account for these differences to give even more customised recommendations.
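As a hedged illustration of the first-level classification step described in Goal 2, the sketch below trains a small decision tree on invented page features (domain, tag counts, word count). The features, the toy training data and the library choice (scikit-learn) are assumptions; as noted above, the real parameters still need research and testing.

```python
# Sketch of first-level URL classification into learning methodologies.
# Feature names and training rows are invented placeholders.
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier

# Toy training examples: features extracted from already-labelled URLs.
train_features = [
    {"domain": "youtube.com", "video_tags": 3, "img_tags": 1, "word_count": 150},
    {"domain": "wikipedia.org", "video_tags": 0, "img_tags": 4, "word_count": 4200},
    {"domain": "soundcloud.com", "video_tags": 0, "img_tags": 0, "word_count": 90},
]
train_labels = ["visual", "textual", "audio"]

vectorizer = DictVectorizer()            # one-hot encodes the domain, keeps the counts numeric
X = vectorizer.fit_transform(train_features)
model = DecisionTreeClassifier(max_depth=3).fit(X, train_labels)

# Classify a freshly crawled URL from its extracted features.
new_page = {"domain": "youtube.com", "video_tags": 5, "img_tags": 0, "word_count": 80}
print(model.predict(vectorizer.transform([new_page]))[0])   # e.g. "visual"
```

The same pipeline could later be extended with a second model (or the tree's confidence scores) to assign the 1-100 complexity score mentioned above.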


3. Robust web crawler to collect the URLs: The web crawler will take the keywords given by the teacher in advance and scrape the internet for relevant links. Once these links are collected in a database table, we will feed them to the machine learning system. The crawler will be built from scratch; no proprietary tools such as Screaming Frog, which would incur additional charges, will be used.
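A simplified sketch of such a from-scratch crawler is shown below, using requests and BeautifulSoup. The seed URL, the keyword match on anchor text and the link limit are placeholder assumptions; a production crawler would also need politeness delays, robots.txt handling and deduplication.

```python
# Simplified keyword-driven crawler sketch (no proprietary tools).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def crawl(seed_url: str, keywords: list[str], max_links: int = 50) -> list[str]:
    """Fetch the seed page and keep outgoing links whose anchor text mentions a keyword."""
    html = requests.get(seed_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    found = []
    for a in soup.find_all("a", href=True):
        text = a.get_text(" ", strip=True).lower()
        if any(kw.lower() in text for kw in keywords):
            found.append(urljoin(seed_url, a["href"]))
        if len(found) >= max_links:
            break
    return found

# Keywords supplied by the teacher in advance, as in the user story below.
links = crawl("https://www.khanacademy.org/math/algebra", ["linear equations", "functions"])
print(links)   # these rows would then be inserted into the URL table for classification
```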

4. Intuitive UI/UX: We wish to build the website with an intuitive user interface and user experience so that students, teachers and administrators can use the application with little to no training.

5. Continually improving: The model will retrain itself on past survey responses and give progressively better recommendations.

User story: 
Background: 

a. The teacher gives our system the keywords of the topics that were covered in class, e.g. [linear algebra, functions, variables, equations]. Our web crawler then takes these keywords, performs the crawl and collects the relevant URLs.

b. We use machine learning to classify the URLs (visual/reading/etc.), assign a score to each URL and store it in our database (the database would look something like the Excel sheet that I shared, except much larger and with more fields).

Example for an algebra class: 

Step 1: Students get a survey asking, on a scale of 1-10, how they rate their understanding of "linear equations and functions". In this example, let's say they answer with an 8.

Step 2: We match the student who rated themselves an 8 with a URL that has a score of 80 (a score of 80 indicates the resource is towards the more complex/advanced end on a scale of 1-100). We send the relevant URLs, which we collected in the background before sending out the survey.

 

Step 3: The students get an email/notification asking their opinion on the resources they received. The questions would be along the lines of "How relevant was the URL on a scale of 1-10?" and "Did you find the resource a. Too advanced, b. Good enough, or c. Too basic?" Based on this information we can fine-tune/customise the model. For example, the student who assessed himself an 8 was recommended resources with scores of 80-90. After going through them, he felt they were too advanced, so the model adjusts for this person and recommends URLs from a lower score band, say 70-80. (These numbers are arbitrary for the sake of the example; the machine learning model would do a better job of making these adjustments.) Now let's say another student who rated himself an 8 felt the resources were too basic; his profile would then be adjusted so that he receives materials with scores in the range 90-100, and so on. This refinement process continues until the students get results customised to their satisfaction.
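The matching and adjustment logic of Steps 2 and 3 could be prototyped with a simple per-student offset before a full ML model takes over. The band widths and step sizes below are arbitrary placeholders, mirroring the arbitrary numbers in the example above.

```python
# Illustrative sketch of Steps 2-3: map the 1-10 self-rating to a 1-100 complexity
# band and shift that band per student after feedback. Numbers are placeholders.

# Per-student complexity offset, learned from previous feedback.
offsets: dict[str, int] = {}

def target_band(student_id: str, self_rating: int, width: int = 10) -> tuple[int, int]:
    """Map a 1-10 self-rating plus the student's offset to a complexity score range."""
    centre = self_rating * 10 + offsets.get(student_id, 0)
    return max(1, centre - width), min(100, centre + width)

def record_feedback(student_id: str, verdict: str) -> None:
    """Shift future recommendations based on 'too advanced' / 'too basic' feedback."""
    step = {"too advanced": -10, "too basic": +10, "good enough": 0}[verdict]
    offsets[student_id] = offsets.get(student_id, 0) + step

print(target_band("student_a", 8))        # e.g. (70, 90) before any feedback
record_feedback("student_a", "too advanced")
print(target_band("student_a", 8))        # band shifts down, e.g. (60, 80)
```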

 
ML and Data Pipeline: 
 
The columns in this table are: Title, Domain, Video URL, Status Code, Method of Learning, Score, Tags, Description and Comments (the last three are empty in this excerpt).

Title: AP Physics 1 review of 1D motion | Physics | Khan Academy
Domain: /watch?v=NyPbcdE9v6o
Video URL: https://www.youtube.com/watch?v=NyPbcdE9v6o
Status Code: 200, Method of Learning: VISUAL, Score: 89

Title: De Broglie wavelength | Physics | Khan Academy
Domain: /watch?v=iTRxuBxttj8
Video URL: https://www.youtube.com/watch?v=iTRxuBxttj8
Status Code: 200, Method of Learning: VISUAL, Score: 62

Title: Finding torque for angled forces | Physics | Khan Academy
Domain: /watch?v=ZQnGh-t25tI
Video URL: https://www.youtube.com/watch?v=ZQnGh-t25tI
Status Code: 200, Method of Learning: VISUAL, Score: 23

Title: AP Physics 1 review of Momentum and Impulse | Physics | Khan Academy
Domain: /watch?v=qMc6KOkmjTU
Video URL: https://www.youtube.com/watch?v=qMc6KOkmjTU
Status Code: 200, Method of Learning: VISUAL, Score: 60

The excerpt above from the Excel sheet is representative of one of the key tables in the database. The first column, Title, is the video title; Domain is the YouTube extension for the video, followed by the actual video URL. The Status Code denotes whether the URL is currently available (200 = success, 404 = not found). Score is the complexity score given by the ML model and fed into the database. Giving a recommendation then amounts to retrieving the URLs whose score matches the student's evaluation score, plus adjustments based on the previous analysis.
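Since the back end is planned in Django (see the milestones below), one possible shape for this table is sketched as a Django model. The field names mirror the spreadsheet columns and are assumptions, not a finalised schema.

```python
# Possible Django model for the URL/resource table above (field names are assumptions).
from django.db import models

class Resource(models.Model):
    VISUAL, AUDIO, TEXTUAL = "VISUAL", "AUDIO", "TEXTUAL"
    LEARNING_CHOICES = [(VISUAL, "Visual"), (AUDIO, "Audio"), (TEXTUAL, "Textual")]

    title = models.CharField(max_length=255)              # video title
    domain = models.CharField(max_length=255)             # e.g. the YouTube /watch?v=... extension
    video_url = models.URLField()
    status_code = models.IntegerField()                   # 200 = success, 404 = not found
    method_of_learning = models.CharField(max_length=16, choices=LEARNING_CHOICES)
    score = models.IntegerField()                         # 1-100 complexity score from the ML model
    tags = models.CharField(max_length=255, blank=True)
    description = models.TextField(blank=True)
    comments = models.TextField(blank=True)

# Retrieving a recommendation is then a range query on the adjusted score band, e.g.:
# Resource.objects.filter(method_of_learning="VISUAL", status_code=200,
#                         score__range=(70, 90)).order_by("-score")
```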

The way data flows through the application is shown in the data-flow diagram below:

 
Milestones 

 

Creating UI Screens 

A UI designer from my team will be designing the entire layout of the web application 
that you're trying to build. All the screens and the user story (how a user will interact with 
your application) will be ready by this milestone 

Sample Survey Form creation/Sample Data collection 

We can discuss what all relevant fields can be put in the questionnaire to the students 
which we help us train the machine learning model, this page can be used as it is to 
collect the training data for the machine learning model. 

Machine Learning Model Development and Tuning 

The brain of the product, based on the data we train it on and the parameters that we 
choose (the type of learning medium preferred), the model will be trained and tested to 
meet your expectations and utmost care will be taken at this phase. 

Front end development (React/Angular) 

Using a modern web framework like react or angular we will be building the front end 
(user interaction) part of the website. 

Back End Development (Django Python) 

The back end will be developed using a python based framework like Django and all the 
website functionalities will be finished by this phase. 

CRM - Phase 1 (Trusted Data Sources) 

The recommended content for the first phase will come from trusted sources like Khan's 
Academy, Udacity, etc. This will help maintain the quality of the content recommended. 

CRM - Phase 2 (Keyword based web scraping) 

A future enhancement could be to improve this by scraping the web and taking feedback 
on how the quality of content recommended was. 

 

Scope and Duration 


We believe we can deliver the first production-ready application in 3, or maybe even 2, months (with 3 developers and one UI/UX designer on the team). However, if the scope of the project changes along the way and new features are requested, the timeline might be extended. We can plan a better timeline once we break the application down into smaller tasks and deliverables. We can have daily calls to keep you posted on project updates, and weekly demos to discuss progress and get feedback during the product development cycle. We will work towards shipping a market-ready product as soon as possible.
