Machine Learning
Interview Questions
Parametric and Non-Parametric Models

What is the difference between a parametric and non-parametric model?

Parametric and non-parametric models are two different approaches to building a model from data. A parametric model has a fixed number of parameters, while a non-parametric model's number of parameters grows with the amount of data.

Parametric models make strong assumptions about the functional form of the relationship between the input and output variables. The number of parameters is determined by the model structure (for example, one coefficient per feature) before any data are seen, and once those parameters are learned, the training data are no longer needed to make predictions. Examples of parametric models include linear regression, logistic regression, and linear support vector machines.
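
A minimal sketch, assuming scikit-learn and NumPy are available: a linear regression always learns one coefficient per feature plus an intercept, so the parameter count stays the same no matter how many training rows are used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

for n_samples in (100, 10_000):
    X = rng.normal(size=(n_samples, 3))  # 3 input features
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=n_samples)

    model = LinearRegression().fit(X, y)
    # Always 3 coefficients + 1 intercept, regardless of n_samples.
    print(n_samples, model.coef_.shape, round(model.intercept_, 3))
```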

Non-parametric models, on the other hand, make fewer assumptions about the functional form of the relationship between the input and output variables. Instead of fixing the parameter count in advance, their effective complexity grows with the size of the training set. This makes them more flexible than parametric models and better able to capture complex patterns in the data, usually at the cost of needing more data and more computation. Examples of non-parametric models include decision trees, random forests, k-nearest neighbors, and kernel support vector machines.
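
As a contrast to the parametric sketch above, here is a minimal example (again assuming scikit-learn) with k-nearest neighbors: the fitted model stores the entire training set, so what it retains grows with the data rather than being summarized by a fixed set of parameters.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

for n_samples in (100, 10_000):
    X = rng.uniform(-3, 3, size=(n_samples, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.1, size=n_samples)

    knn = KNeighborsRegressor(n_neighbors=5).fit(X, y)
    # n_samples_fit_ equals the size of the stored training set.
    print(n_samples, knn.n_samples_fit_)
```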

The choice between a parametric and non-parametric model depends on the nature of the problem and the amount of data available. If you have prior knowledge about the functional form of the relationship between the input and output variables, or only a modest amount of data, a parametric model may be more appropriate. If you have little or no prior knowledge about the functional form and enough data to support a flexible model, a non-parametric model may be more suitable.
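
When the true functional form is unknown, cross-validation is a practical way to compare the two approaches on the same data. A minimal sketch, assuming scikit-learn, using a deliberately non-linear synthetic target:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.2, size=500)  # non-linear relationship

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(random_state=0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```

On this data the non-parametric random forest should score noticeably higher, reflecting the mismatch between the linear model's assumed form and the true relationship.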

In practice, it is common to use a combination of both parametric and non-parametric models. For example, a parametric model may be used to make initial predictions, and then a non-parametric model may be used to refine the predictions based on the residuals. This approach can help balance the bias-variance tradeoff and lead to better overall performance.
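
A minimal sketch of that hybrid idea, assuming scikit-learn: fit a linear model to capture the global trend, then fit a non-parametric model (here k-nearest neighbors) to its residuals and add the two predictions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(1000, 1))
y = 2.0 * X.ravel() + np.sin(3 * X).ravel() + rng.normal(scale=0.2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1: parametric model captures the global linear trend.
linear = LinearRegression().fit(X_train, y_train)
residuals = y_train - linear.predict(X_train)

# Stage 2: non-parametric model captures structure left in the residuals.
knn = KNeighborsRegressor(n_neighbors=10).fit(X_train, residuals)

y_pred = linear.predict(X_test) + knn.predict(X_test)
print("linear only :", mean_squared_error(y_test, linear.predict(X_test)))
print("linear + kNN:", mean_squared_error(y_test, y_pred))
```

Fitting a second model to residuals is the same idea that underlies boosting: the flexible stage only has to explain what the simple stage missed, which keeps bias low without letting variance grow unchecked.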