Explain Laplace Smoothing
Laplace Smoothing:
Naïve Bayes prediction multiplies conditional probabilities together, so each conditional probability must be non-zero; if even one of them is zero, the whole predicted probability becomes zero. For example, suppose a training dataset of 1000 tuples in which the count of rows with Fever = YES is 0, the count with Cough = NO is 900, and the count with Breathing issue = YES is 100. Since P(Fever = YES) = 0/1000 = 0, any product containing it is zero and we cannot perform correct classification. To overcome this problem we can use the Laplacian correction, also known as Laplace smoothing: we pretend the training dataset contains one extra tuple for each case. With 3 cases and 1 pseudo-tuple each, the denominator grows from 1000 to 1003, and the corrected probabilities become:

P(Fever = YES) = (0 + 1) / 1003 = 1/1003
P(Cough = NO) = (900 + 1) / 1003 = 901/1003
P(Breathing issue = YES) = (100 + 1) / 1003 = 101/1003

The corrected probabilities stay very close to the original ones, but none of them is zero any more.
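The correction above can be sketched in a few lines of Python. This is a minimal illustration, not a library implementation: the function name `laplace_smoothed_probs`, the `alpha` parameter (the pseudo-count, 1 for the classic Laplace correction), and the example counts are all taken from or invented for the example above.

```python
from collections import Counter

def laplace_smoothed_probs(counts, alpha=1):
    """Return Laplace-smoothed probabilities for each case.

    counts: mapping from case name -> observed count.
    alpha:  pseudo-count added to every case (1 = classic Laplace correction).
    """
    total = sum(counts.values())          # e.g. 1000 tuples
    k = len(counts)                       # number of distinct cases, e.g. 3
    # Add alpha to every numerator and alpha * k to the denominator
    return {case: (c + alpha) / (total + alpha * k)
            for case, c in counts.items()}

# Counts from the (hypothetical) 1000-tuple dataset in the example above
counts = {"Fever=YES": 0, "Cough=NO": 900, "BreathingIssue=YES": 100}
probs = laplace_smoothed_probs(counts)
# Without smoothing, P(Fever=YES) would be 0/1000 = 0; here it is 1/1003,
# so a Naïve Bayes product that includes it no longer collapses to zero.
```

Note that the smoothed probabilities still sum to 1, because each of the k numerators gains alpha while the denominator gains alpha * k.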