The curse of dimensionality refers to problems that arise when the number of input features increases.
In K-NN, distance calculations between points become less meaningful in high-dimensional space.
As dimensions increase, data points spread out and pairwise distances concentrate, so every point ends up almost equally far from every other point.
This makes it hard for K-NN to find true nearest neighbours, which reduces accuracy.
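This distance-concentration effect can be checked with a small simulation. The following is only an illustrative sketch (not part of the original question): it draws random points in a unit hypercube and compares a query point's nearest and farthest distances as the dimension grows; their ratio drifts toward 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points = 1000

# Compare the nearest and farthest distance from a random query point
# as the number of dimensions grows: the max/min ratio approaches 1.
for d in (2, 10, 100, 1000):
    data = rng.random((n_points, d))   # uniform points in the unit hypercube
    query = rng.random(d)
    dists = np.linalg.norm(data - query, axis=1)
    print(f"d={d:4d}  min={dists.min():.3f}  max={dists.max():.3f}  "
          f"max/min={dists.max() / dists.min():.2f}")
```

When the nearest and farthest distances are nearly the same, the ranking of "neighbours" carries little information, which is exactly why K-NN's predictions degrade.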
Also, more dimensions mean more computation and storage requirements, which can make K-NN slower, not faster.
Options (A), (C), and (D) are incorrect because they suggest improvements or benefits that do not occur.
Therefore, due to the curse of dimensionality, K-NN's prediction accuracy degrades when it is given too many input variables.
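As a hedged illustration of this conclusion (an assumed setup using scikit-learn, not part of the original question), the sketch below adds increasing numbers of uninformative noise features to a fixed classification task and reports K-NN test accuracy, which typically drops as the irrelevant dimensions pile up.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# K-NN accuracy as irrelevant (noise) features are added to 5 informative ones.
for n_noise in (0, 10, 100, 500):
    X, y = make_classification(n_samples=1000, n_features=5 + n_noise,
                               n_informative=5, n_redundant=0,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    acc = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr).score(X_te, y_te)
    print(f"noise features={n_noise:3d}  test accuracy={acc:.2f}")
```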