Question:

A naive Bayes classifier is used to solve a two-class classification problem with class labels \( y_1, y_2 \). Suppose the prior probabilities are \( P(y_1) = \frac{1}{3} \) and \( P(y_2) = \frac{2}{3} \), and that, for a specific feature vector \( x \) in a discrete feature space,

\[ P(x \mid y_1) = \frac{3}{4} \quad \text{and} \quad P(x \mid y_2) = \frac{1}{4}. \]

The probability of misclassifying \( x \) is: (round off to two decimal places)

Hint:

In naive Bayes classification, the probability of misclassification is the complement of the maximum posterior probability. Always calculate the posteriors before deciding the predicted class.
Updated On: Apr 4, 2025

Solution and Explanation

To find the probability of misclassification, first compute the evidence \( P(x) \) by the law of total probability, then the posterior probabilities via Bayes' theorem:
\[ P(x) = P(x \mid y_1)\,P(y_1) + P(x \mid y_2)\,P(y_2) = \frac{3}{4} \cdot \frac{1}{3} + \frac{1}{4} \cdot \frac{2}{3} = \frac{1}{4} + \frac{1}{6} = \frac{5}{12}. \]
The posteriors are
\[ P(y_1 \mid x) = \frac{P(x \mid y_1)\,P(y_1)}{P(x)} = \frac{1/4}{5/12} = \frac{3}{5} = 0.6, \qquad P(y_2 \mid x) = \frac{1/6}{5/12} = \frac{2}{5} = 0.4. \]
The classifier assigns \( x \) to the class with the larger posterior, here \( y_1 \), so the probability of misclassifying \( x \) is
\[ P(\text{misclassify}) = 1 - \max\bigl(P(y_1 \mid x), P(y_2 \mid x)\bigr) = 1 - 0.6 = 0.40. \]
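The arithmetic above can be verified with a short script; this sketch uses Python's `fractions` module so every intermediate value stays exact (the variable names are illustrative, not from the problem statement):

```python
from fractions import Fraction

# Priors and class-conditional likelihoods from the problem statement
p_y1, p_y2 = Fraction(1, 3), Fraction(2, 3)
p_x_y1, p_x_y2 = Fraction(3, 4), Fraction(1, 4)

# Evidence via the law of total probability: P(x) = P(x|y1)P(y1) + P(x|y2)P(y2)
p_x = p_x_y1 * p_y1 + p_x_y2 * p_y2  # 5/12

# Posteriors via Bayes' theorem
post_y1 = p_x_y1 * p_y1 / p_x  # 3/5 = 0.6
post_y2 = p_x_y2 * p_y2 / p_x  # 2/5 = 0.4

# The classifier predicts the class with the larger posterior,
# so the misclassification probability is the complement of that maximum.
p_error = 1 - max(post_y1, post_y2)
print(float(post_y1), float(post_y2), float(p_error))  # 0.6 0.4 0.4
```

Because the posteriors must sum to 1, `p_error` is simply the smaller posterior, confirming the answer 0.40.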

Questions Asked in GATE DA exam

View More Questions