Practical No.2
Title:- Building classifiers for two classes using Bayes' rule
Aim:- Study the classification of two classes using Bayes' rule
Objectives:-
- To understand Bayes' rule
Problem Statement:- To study the classification of two classes using Bayes' rule
Theory:-
Naive Bayes is a statistical classification technique based on Bayes' theorem. It is one of
the simplest supervised learning algorithms. The Naive Bayes classifier is a fast, accurate and
reliable algorithm, and it achieves high accuracy and speed on large datasets.
The Naive Bayes classifier assumes that the effect of a particular feature in a class is
independent of the other features. For example, whether a loan applicant is desirable or not
depends on his/her income, previous loan and transaction history, age, and location. Even if
these features are interdependent, they are still considered independently. This assumption
simplifies computation, and that is why the method is considered "naive". The assumption is
called class conditional independence.
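The class conditional independence assumption can be illustrated with a small sketch. The toy loan rows and feature names below are assumed purely for illustration and are not from any real dataset:

```python
# Sketch: the "naive" independence assumption on a toy loan dataset.
# Each row is (income_level, has_prior_loan, label); all values are made up.
data = [
    ("high", True,  "desirable"),
    ("high", False, "desirable"),
    ("low",  True,  "undesirable"),
    ("low",  False, "desirable"),
    ("low",  True,  "undesirable"),
    ("high", True,  "desirable"),
]

def cond_prob(feature_index, value, label):
    """P(feature = value | class = label), estimated by counting."""
    in_class = [row for row in data if row[2] == label]
    matching = [row for row in in_class if row[feature_index] == value]
    return len(matching) / len(in_class)

# Under class conditional independence,
# P(income=high, prior_loan=True | desirable)
# is approximated by the product of the per-feature probabilities:
p = cond_prob(0, "high", "desirable") * cond_prob(1, True, "desirable")
print(p)
```

Estimating each feature's probability separately and multiplying is what makes the computation simple, even when the features are in fact interdependent.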
As we have studied Bayes' theorem in detail, let us understand the meaning of a few terms
related to the concept, which are used in the Bayes' theorem formula and its derivation:
Conditional Probability
Conditional probability is the probability of an event A occurring given that another event
B has already occurred. It is denoted as P(A|B).
Joint Probability
When the probability of two or more events occurring together at the same time is
measured, it is called joint probability. For two events A and B, the joint probability is
denoted as P(A∩B).
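The relationship between the two terms can be checked with a couple of lines of Python. The numeric values below are assumed only to make the arithmetic concrete:

```python
# Sketch: joint vs. conditional probability with assumed example numbers.
p_a_and_b = 0.12   # joint probability P(A ∩ B), assumed value
p_b = 0.30         # marginal probability P(B), assumed value

# Conditional probability follows from the joint: P(A|B) = P(A ∩ B) / P(B)
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)
```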
Random Variables
A random variable is a variable whose value is determined by the outcome of a random
experiment.
P(A|B) = P(B|A) * P(A) / P(B) ... (a)
The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis
of most modern AI systems for probabilistic inference.
It shows the simple relationship between joint and conditional probabilities. Here,
P(B|A) is called the likelihood: assuming the hypothesis A is true, it is the probability of
observing the evidence B.
P(A) is called the prior probability: the probability of the hypothesis before considering
the evidence.
P(A|B) is called the posterior probability: the probability of the hypothesis after the
evidence has been taken into account.
In equation (a), in general, we can write P(B) = Σi P(Ai) * P(B|Ai), hence Bayes' rule
can be written as:
P(Ai|B) = P(B|Ai) * P(Ai) / Σj P(B|Aj) * P(Aj)
where A1, A2, A3, ..., An is a set of mutually exclusive and exhaustive events.
The Naive Bayes classifier calculates the probability of an event in the following steps:
Step 1: Calculate the prior probability for the given class labels.
Step 2: Find the likelihood probability of each attribute for each class.
Step 3: Put these values into the Bayes formula and calculate the posterior probability.
Step 4: See which class has the higher posterior probability; the input is assigned to that
class.
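The four steps can be sketched end to end on a tiny hand-made dataset. The single attribute, its values, and the labels below are assumed for illustration:

```python
# Sketch of the four steps on a tiny assumed dataset: one categorical
# attribute ("outlook") predicting a yes/no class.
from collections import Counter

X = ["sunny", "sunny", "rainy", "rainy", "sunny"]   # attribute values
y = ["yes",   "no",    "no",    "no",    "yes"]     # class labels

def classify(value):
    label_counts = Counter(y)
    n = len(y)
    posteriors = {}
    for label, count in label_counts.items():
        prior = count / n                              # Step 1: P(class)
        matches = sum(1 for xv, yv in zip(X, y)
                      if yv == label and xv == value)
        likelihood = matches / count                   # Step 2: P(attr | class)
        posteriors[label] = prior * likelihood         # Step 3: Bayes numerator
    return max(posteriors, key=posteriors.get)         # Step 4: pick max class

print(classify("sunny"))
```

Step 3 computes only the numerator of Bayes' rule, since the denominator P(B) is the same for every class and does not change which class wins in Step 4.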
Naïve Bayes is one of the fastest and easiest ML algorithms for predicting the class of a dataset.
○ Naive Bayes assumes that all features are independent or unrelated, so it cannot learn
relationships between features.
○ Spam filtering - Bayes' theorem is commonly used in email spam filtering, where it helps
to identify emails that are likely to be spam based on the text content and other features.
○ Medical diagnosis - Bayes' theorem can be used to diagnose medical conditions based
on the observed symptoms, test results, and prior knowledge about the prevalence and
characteristics of the disease.
○ Risk assessment - Bayes' theorem can be used to assess the risk of events such as
accidents, natural disasters, or financial market fluctuations based on historical data and
other relevant factors.
○ Fraud detection - Bayes' theorem can be used to detect fraudulent behavior, such as
credit card or insurance fraud, by analyzing patterns of transactions and other data.
Now we will implement a Naive Bayes algorithm using Python. For this, we will use the
"user_data" dataset, which we have used in our other classification models, so we can
easily compare the Naive Bayes model with the other models.
Steps to implement:
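A minimal sketch of the implementation steps is given below. It assumes scikit-learn and NumPy are installed; since the "user_data" CSV is not included here, a small synthetic stand-in with an assumed Age/EstimatedSalary shape is generated instead:

```python
# Sketch: Naive Bayes classification with scikit-learn, on a synthetic
# stand-in for the "user_data" dataset (two assumed features per user).
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Assumed features: age and estimated salary, drawn from normal distributions.
X = rng.normal(loc=[[35, 50000]], scale=[[10, 15000]], size=(200, 2))
# Assumed labelling rule: older, higher-paid users tend to purchase.
y = ((X[:, 0] > 35) & (X[:, 1] > 50000)).astype(int)

# Step 1: split the data into training and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Step 2: fit a Gaussian Naive Bayes classifier to the training set.
clf = GaussianNB()
clf.fit(X_train, y_train)

# Step 3: predict the test set and measure accuracy.
y_pred = clf.predict(X_test)
acc = accuracy_score(y_test, y_pred)
print(acc)
```

With the real "user_data" file, the synthetic generation above would be replaced by loading the CSV, but the split/fit/predict steps stay the same.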
Conclusion:- We have studied how to classify two classes using Bayes' rule and what the
uses of Bayes' rule are.