What is regularization? Regularization is a broad set of techniques for curbing overfitting. In the context of linear regression and generalized linear models, it typically means adding a penalty term to the loss that shrinks the coefficients. In the context of decision trees, it means pruning the tree so it grows less deep; small changes in the data then no longer cause large changes in the tree's structure, which curbs overfitting. In the context of neural networks, regularization can take the form of dropout: randomly switching off neurons during training so that the network is forced to learn along different paths, which helps it generalize better and hence mitigates overfitting. All three forms are sketched in the code below.
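A minimal sketch of the three forms, assuming scikit-learn and PyTorch are available; the penalty strength `alpha`, the tree settings `max_depth` and `ccp_alpha`, and the dropout rate `p=0.5` are illustrative values, not prescribed ones:

```python
from sklearn.datasets import make_regression, make_classification
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeClassifier
import torch.nn as nn

# --- Penalized linear regression (L2 / ridge penalty) ---
X_reg, y_reg = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)
ridge = Ridge(alpha=1.0)  # alpha scales the penalty on coefficient size (illustrative)
ridge.fit(X_reg, y_reg)

# --- Pruned decision tree ---
X_clf, y_clf = make_classification(n_samples=200, n_features=20, random_state=0)
tree = DecisionTreeClassifier(
    max_depth=3,     # pre-pruning: cap the depth of the tree
    ccp_alpha=0.01,  # post-pruning: cost-complexity pruning strength
    random_state=0,
)
tree.fit(X_clf, y_clf)

# --- Dropout in a small neural network ---
net = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # in training mode, randomly zeroes ~50% of activations
    nn.Linear(64, 2),
)
net.train()  # dropout is active during training...
net.eval()   # ...and disabled at inference time
```

Note that dropout only fires in training mode; switching the network to `eval()` disables it, which is why the last two lines matter in practice.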