Naive Bayes Classification

Prajwal Kb
1 min read · Jun 30, 2020

Naive Bayes is a classifier belonging to the family of simple probabilistic classifiers, based on applying Bayes' theorem with strong (naive) independence assumptions between the features.

Bayes' theorem formula:

P(class | features) = P(features | class) × P(class) / P(features)

For categorical data, the probabilities are calculated simply by counting the number of occurrences.
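As a rough sketch of that counting step, the snippet below estimates a conditional probability for a made-up categorical feature by counting (feature value, class) co-occurrences; the toy weather data and names are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical toy data: an "Outlook" feature and a "Play" class label.
outlook = ["Sunny", "Sunny", "Overcast", "Rain", "Rain", "Overcast", "Sunny", "Rain"]
play    = ["No",    "No",    "Yes",      "Yes",  "No",   "Yes",      "Yes",   "Yes"]

# Count (feature value, class) pairs and class occurrences.
pair_counts = Counter(zip(outlook, play))
class_counts = Counter(play)

# Conditional probability P(Outlook = Sunny | Play = Yes) by simple counting.
p_sunny_given_yes = pair_counts[("Sunny", "Yes")] / class_counts["Yes"]
print(p_sunny_given_yes)  # 1 occurrence out of 5 "Yes" rows -> 0.2
```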

The predictor prior probability, P(features), is not incorporated while calculating the probabilities because it is the same for both classes and can therefore be ignored without affecting the final outcome.

The prior probability is the probability of an event occurring before the collection of new data. The prior plays an important role when classifying with Naive Bayes, as it strongly influences the class assigned to a new test point.
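A minimal sketch of estimating the prior, assuming a small set of hypothetical training labels: the prior of each class is simply its relative frequency.

```python
from collections import Counter

# Hypothetical training labels; the prior is each class's relative frequency.
labels = ["No", "No", "Yes", "Yes", "No", "Yes", "Yes", "Yes"]
counts = Counter(labels)
priors = {c: n / len(labels) for c, n in counts.items()}
print(priors)  # {'No': 0.375, 'Yes': 0.625}
```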

The likelihood function gives the likelihood of a data point occurring in a class. The conditional independence assumption is leveraged while computing the likelihood probability: the likelihood of the whole feature vector given a class is the product of the individual feature likelihoods.
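The sketch below illustrates that product, using made-up per-feature probability tables; the feature names and values are assumptions for illustration only.

```python
# Hypothetical per-feature conditional probabilities P(feature value | class).
likelihoods = {
    "Yes": {"Outlook=Sunny": 0.2, "Humidity=High": 0.4},
    "No":  {"Outlook=Sunny": 0.6, "Humidity=High": 0.8},
}

def class_likelihood(features, cls):
    # Conditional independence: multiply the per-feature probabilities.
    result = 1.0
    for f in features:
        result *= likelihoods[cls][f]
    return result

x = ["Outlook=Sunny", "Humidity=High"]
print(class_likelihood(x, "Yes"))  # 0.2 * 0.4 ≈ 0.08
print(class_likelihood(x, "No"))   # 0.6 * 0.8 ≈ 0.48
```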

The posterior probability is what is finally compared across the classes, and the test point is assigned to the class whose posterior probability is greater.
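As a minimal sketch of that decision step, the snippet below multiplies the (hypothetical) prior and likelihood for each class, ignoring the shared denominator, and picks the class with the larger score.

```python
# Hypothetical priors and likelihoods carried over from the sketches above.
priors = {"Yes": 0.625, "No": 0.375}
likelihoods = {"Yes": 0.08, "No": 0.48}

# Posterior score for each class, up to the shared normalizing constant P(features).
posterior_scores = {c: priors[c] * likelihoods[c] for c in priors}
prediction = max(posterior_scores, key=posterior_scores.get)
print(posterior_scores)  # Yes ≈ 0.05, No ≈ 0.18
print(prediction)        # No
```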

Laplace Smoothing:
Helps to solve the zero-probability problem, where a dictionary word that never occurs with a class in the training data would otherwise force the whole product to zero. Words occurring in a test sentence that are not part of the dictionary are not considered part of the feature vector, since only dictionary words are used as features; these new words are simply ignored.
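A minimal sketch of add-one (Laplace) smoothing, assuming a hypothetical dictionary and a list of words seen with one class in training: every count is incremented by 1 so that no dictionary word ends up with probability zero.

```python
from collections import Counter

# Hypothetical dictionary and words observed with one class during training.
vocabulary = ["free", "win", "meeting", "schedule"]
words_in_class = ["free", "win", "win"]
counts = Counter(words_in_class)

def smoothed_prob(word):
    # Laplace smoothing: (count + 1) / (total words in class + dictionary size).
    return (counts[word] + 1) / (len(words_in_class) + len(vocabulary))

print(smoothed_prob("win"))      # (2 + 1) / (3 + 4) ≈ 0.43
print(smoothed_prob("meeting"))  # (0 + 1) / (3 + 4) ≈ 0.14, not zero
```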

Thank you
