| Parametric Machine Learning Algorithms | Nonparametric Machine Learning Algorithms |
| Benefits: simpler to understand and interpret, very fast to learn from data, and require less training data. | Benefits: flexible (capable of fitting a large number of functional forms), make no strong assumptions about the underlying function, and can yield higher-performance models. |
| Limitations: constrained to the chosen form of the function, suited to simpler problems, and in practice unlikely to match the true underlying mapping function. | Limitations: require much more training data, are slower to train, and carry a higher risk of overfitting the training data. |
| Lazy Learning | Eager Learning |
| Simply stores the training data and delays building a model until it receives new data to classify. | On the basis of the training set, it constructs a classification model before receiving new data to classify. |
| Takes less time in training but more time in prediction. | Takes more time in training but less time in prediction. |
| e.g. K-nearest neighbour, case-based reasoning. | e.g. Decision trees, Naive Bayes, artificial neural networks. |
Q3. What is K-NN in classification?

A3) K-nearest-neighbour (K-NN) classification was developed from the need to perform discriminant analysis when reliable parametric estimates of probability densities are unknown or difficult to determine. When K-NN is used for classification, the output is the class with the highest frequency among the K most similar instances; the class with the maximum vote is taken as the prediction. For a new data instance, the probability of each class can be calculated as the normalized frequency of samples belonging to that class among the K most similar instances.
For example, in a binary classification problem (class is 0 or 1): p(class=0) = count(class=0) / (count(class=0) + count(class=1))
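The voting and probability calculation above can be sketched in plain Python. This is a minimal illustration, not a library implementation; the function name `knn_predict` and the toy data are made up for the example, and distances are assumed to be Euclidean.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points.

    `train` is a list of (features, label) pairs. Returns the predicted
    label and the class probabilities, i.e. the normalized frequency of
    each class among the k most similar instances.
    """
    # Sort training points by Euclidean distance to the query, keep the k nearest.
    neighbours = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    # Count the votes for each class among the k neighbours.
    votes = Counter(label for _, label in neighbours)
    # Normalized vote frequency, e.g. p(class=0) = count(class=0) / k.
    probs = {cls: count / k for cls, count in votes.items()}
    predicted = votes.most_common(1)[0][0]
    return predicted, probs

# Toy binary data: two points of class 0, three points of class 1.
train = [((1.0, 1.0), 0), ((1.2, 0.8), 0),
         ((4.0, 4.0), 1), ((4.2, 3.9), 1), ((3.8, 4.1), 1)]
label, probs = knn_predict(train, query=(4.0, 3.8), k=3)
print(label, probs)  # → 1 {1: 1.0}
```

With k=3, a query near the class-1 cluster collects three class-1 votes, so p(class=1) = 3/3 = 1.0; a query near the class-0 cluster would instead return class 0 with p(class=0) = 2/3 and p(class=1) = 1/3.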