K-Nearest Neighbours (K-NN) is a simple and intuitive machine learning algorithm.
It is called a lazy learner because it does not build a model during training — it simply stores the training data and delays computation until prediction time.
When a new data point needs to be classified, K-NN searches for the nearest neighbours in the stored training data.
It is non-parametric because it does not assume any fixed form for the underlying data distribution.
Instead, it uses the entire training dataset to make decisions.
This is why K-NN can adapt well to complex decision boundaries, but it can be computationally expensive at prediction time on large datasets, since every query must be compared against all stored points.
Therefore, the correct answer is Lazy learning and non-parametric learning.
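The lazy, instance-based behaviour described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: "training" only stores the data, and all distance computation happens at prediction time. The dataset and function name here are hypothetical examples.

```python
from collections import Counter
import math

def knn_predict(train, new_point, k=3):
    """Classify new_point by majority vote among its k nearest training points.

    train is a list of ((x, y), label) pairs stored as-is -- no model is
    fit in advance, which is what makes K-NN a "lazy" learner.
    """
    # All computation is deferred to prediction time: measure the
    # Euclidean distance from new_point to every stored training point.
    nearest = sorted(train, key=lambda item: math.dist(item[0], new_point))
    # Take the labels of the k closest points and vote.
    k_labels = [label for _, label in nearest[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Tiny hypothetical dataset: two well-separated clusters.
training_data = [
    ((1.0, 1.0), "A"), ((1.5, 2.0), "A"), ((2.0, 1.5), "A"),
    ((6.0, 6.0), "B"), ((6.5, 5.5), "B"), ((7.0, 6.5), "B"),
]

print(knn_predict(training_data, (1.8, 1.2)))  # → A
print(knn_predict(training_data, (6.2, 6.1)))  # → B
```

Note that no work is done until `knn_predict` is called, and no assumption is made about the shape of the data distribution — the two defining properties discussed above.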