
Project Life Cycle

What is the AI Project Cycle?

 The AI Project Cycle is the life cycle of an AI project. It breaks the
work into a structured sequence of steps, making it easier to approach
and solve a problem with AI. The cycle is divided into 5 stages:
 Problem Scoping
 Data acquisition
 Data Exploration
 Modelling
 Evaluation and Deployment
[Flow chart: the five stages of the AI Project Cycle]
Problem Scoping

 It is a fact that we are surrounded by problems. They could be small
or big, sometimes ignored or sometimes even critical. Many times,
we become so used to a problem that it becomes a part of our life.
Identifying such a problem and having a vision to solve it is what
Problem Scoping is about. Scoping a problem is not easy, as we need
a deeper understanding of it so that the picture becomes clearer
while we are working to solve it. Hence, we use the 4Ws Problem
Canvas to help us out.
4 W’s Canvas

 The 4Ws of Problem Scoping are Who, What, Where and Why. They help in
identifying and understanding the problem in a better and more
efficient manner.
 Who - The “Who” block helps us comprehend and categorise everyone who
is affected, directly or indirectly, by the problem. These people are
called the Stakeholders.
 What - The “What” block helps us understand and identify the nature
of the problem. Under this block, you also gather evidence to prove
that the problem you have selected exists.
 Where - The “Where” block looks at where the problem arises: the
situation, context, and location.
 Why - The “Why” block asks why the given problem is worth solving.
Problem Statement Template

 The Problem Statement Template helps us summarise all the key
points into one single template, so that in the future, whenever
there is a need to look back at the basis of the problem, we can
refer to the Problem Statement Template and understand its key
elements.
Data Acquisition

 As the term clearly mentions, this stage is about acquiring data for
the project. Let us first understand what data is. Data can be a
piece of information or facts and statistics collected together for
reference or analysis. Whenever we want an AI project to be able
to predict an output, we need to train it first using data.
 For example, if you want to make an artificially intelligent system
which can predict the salary of any employee based on his previous
salaries, you would feed the data of his previous salaries into the
machine. This is the data with which the machine can be trained.
Once it is ready, it can predict his next salary. The previous salary
data here is known as Training Data, while the data set used to check
the predictions is known as the Testing Data.
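The training/testing split described above can be sketched in plain Python; the salary figures below are made-up illustrative values:

```python
# Hypothetical monthly salary history for one employee (made-up values).
salaries = [30000, 32000, 34000, 36000, 38000, 40000, 42000, 44000]

# Hold the most recent 25% back for testing; train on the rest.
split = int(len(salaries) * 0.75)
training_data = salaries[:split]   # used to train the model
testing_data = salaries[split:]    # used to check its predictions

print(training_data)
print(testing_data)
```

In practice the split ratio varies, but the idea is the same: the model never sees the testing portion while it is being trained.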
Data Exploration

 Data Exploration is the process of arranging the gathered data
uniformly for a better understanding. Data can be arranged in the
form of a table, a chart, or a database. To analyse the data, you
need to visualise it in some user-friendly format so that you can:
 Quickly get a sense of the trends, relationships and patterns
 Define a strategy for which model to use at a later stage
 Communicate the same to others effectively
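Even very simple summaries can give that quick sense of a trend; a minimal sketch in Python, using made-up monthly sales figures:

```python
# Made-up monthly sales figures to explore.
sales = [120, 135, 150, 145, 160, 175, 190]

# Simple summaries give a quick sense of the spread.
print("min:", min(sales))
print("max:", max(sales))
print("mean:", sum(sales) / len(sales))

# A crude text chart: one '#' per 10 units, enough to see the pattern.
for month, value in enumerate(sales, start=1):
    print(f"Month {month:2}: " + "#" * (value // 10))
```

In a real project you would use a charting tool instead of text bars, but the goal is the same: make patterns visible at a glance.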
Modelling

 A graphical representation makes the data understandable for
humans, as we can discover trends and patterns in it. But when it
comes to machines accessing and analysing data, they need the
data in the most basic form of numbers (which is binary – 0s and 1s),
and when it comes to discovering patterns and trends in data, the
machine uses mathematical representations of the same. The
ability to mathematically describe the relationship between
parameters is the heart of every AI model. Thus, whenever we talk
about developing AI models, it is the mathematical approach to
analysing data that we refer to.
Types of modelling

 Rule Based Approach
 Refers to the AI modelling where the rules are defined by the
developer. The machine follows the rules or instructions mentioned
by the developer and performs its task accordingly.
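A rule-based model can be as simple as a hand-written if/else chain; a minimal sketch (the temperature rules here are an illustrative example, not from the text):

```python
# Rule-based approach: the developer writes every rule explicitly.
# Illustrative example: classify the state of water from its
# temperature in degrees Celsius.
def water_state(temp_c):
    if temp_c <= 0:
        return "solid"
    elif temp_c >= 100:
        return "gas"
    else:
        return "liquid"

print(water_state(-5))   # solid
print(water_state(25))   # liquid
print(water_state(120))  # gas
```

The machine never learns anything here; if the rules are wrong or incomplete, its answers are too.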
 Learning Based Approach
 Refers to the AI modelling where the machine learns by itself. Under
the Learning Based approach, the AI model gets trained on the data
fed to it and is then able to adapt as the data changes.
Types of learning based approach

 Supervised Learning
 In a supervised learning model, the dataset which is fed to the
machine is labelled. In other words, the dataset is known to the
person training the machine; only then is he/she able to label
the data.
 There are two types of Supervised Learning models:
 Classification: Where the data is classified according to discrete labels.
 Regression: Such models work on continuous data.
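A regression model of this kind can be sketched by fitting a straight line to labelled pairs with ordinary least squares; the experience/salary numbers below are made up:

```python
# Supervised learning sketch: fit y = a*x + b to labelled (x, y) pairs
# by ordinary least squares. The data is made up for illustration.
xs = [1, 2, 3, 4, 5]          # e.g. years of experience (inputs)
ys = [32, 34, 36, 38, 40]     # e.g. salary in thousands (the labels)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Slope: covariance of x and y divided by variance of x.
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

print(a, b)          # learned slope and intercept
print(a * 6 + b)     # prediction for an unseen input (year 6)
```

Because the output is a continuous number rather than a class label, this is regression; a classification model would instead predict one of a fixed set of labels.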
 Unsupervised Learning
 An unsupervised learning model works on an unlabelled dataset. This means
that the data fed to the machine is random, and it is possible that the
person training the model does not have any information regarding it.
 Unsupervised learning models can be further divided into two categories:
 Clustering: Refers to the unsupervised learning algorithm which can cluster
the unknown data according to the patterns or trends identified in it.
 Dimensionality Reduction: We humans are able to visualise up to 3 dimensions
only, but many datasets have entities that exist beyond 3 dimensions.
Dimensionality reduction brings such data down to fewer dimensions so that
it can be visualised and analysed.
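Clustering can be sketched with a tiny k-means loop on one-dimensional points; the values and the choice of two clusters are illustrative assumptions:

```python
# Unsupervised learning sketch: group unlabelled 1-D points into two
# clusters with a minimal k-means loop (made-up values).
points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
centres = [points[0], points[3]]  # crude initial guesses

for _ in range(10):
    clusters = ([], [])
    for p in points:
        # Assign each point to its nearest centre.
        idx = 0 if abs(p - centres[0]) <= abs(p - centres[1]) else 1
        clusters[idx].append(p)
    # Move each centre to the mean of the points assigned to it.
    centres = [sum(c) / len(c) for c in clusters]

print(centres)  # one centre near the low group, one near the high group
```

No labels were given; the algorithm found the two groups purely from the pattern in the numbers.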
Evaluation

 Once a model has been made and trained, it needs to go through
proper testing so that one can calculate the efficiency and
performance of the model. Hence, the model is tested with the help
of Testing Data (which was separated out of the acquired dataset
at the Data Acquisition stage), and the efficiency of the model is
calculated on the basis of the parameters mentioned below:
 Accuracy, Precision, Recall, F1 Score
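These four parameters can be computed directly from the counts of correct and incorrect predictions; a sketch on a made-up binary-classification run (1 = positive class):

```python
# Toy evaluation: compare predicted labels against the actual ones.
# Both lists are made up for illustration.
actual    = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
predicted = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# Count true/false positives and negatives.
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

accuracy = (tp + tn) / len(actual)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(accuracy, precision, recall, f1)
```

Accuracy looks at all predictions, while precision and recall focus on the positive class; F1 combines the latter two into a single score.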
Neural Networks

 Neural networks are loosely modelled after how neurons in the
human brain behave. The key advantage of neural networks is
that they are able to extract data features automatically, without
needing input from the programmer. A neural network is essentially
a system of organising machine learning algorithms to perform
certain tasks. It is a fast and efficient way to solve problems for
which the dataset is very large, such as images.
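A single artificial neuron illustrates the basic building block: a weighted sum of inputs plus a bias, passed through an activation function. The weights below are made up; in a real network they are learned during training:

```python
import math

# Sigmoid activation squashes any number into the range (0, 1).
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# One artificial neuron: weighted sum of inputs, plus a bias,
# passed through the activation function.
def neuron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Made-up weights and inputs; real networks learn these from data.
output = neuron([0.5, 0.8], [0.4, -0.6], bias=0.1)
print(output)  # a value between 0 and 1
```

A full neural network connects many such neurons in layers, and training adjusts all the weights and biases at once.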
