Ensemble of Models
An "ensemble" is a set of trained models whose predictions are combined in a principled way. The goal of ensemble learning is to aggregate learned models in order to improve prediction accuracy on unseen input data (generalization). The success of combining multiple models depends on the error of each individual model and the correlation between those errors. In a good ensemble, the prediction models are individually accurate and learn different structures from the data, so each one adds valuable information.

In practice, how well an ensemble performs is therefore data dependent. For example, if a single neural network is expressive enough to learn the underlying function of the given problem well, there is little or no room for improvement. Ensembles can also fail when the sample size of the training data is small: with only a few hundred available training examples, combining models can lead to overfitting. In general, however, if the combination scheme is applied correctly, the ensemble's prediction is better than that of each individual model in it.
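A minimal sketch of this idea, using simple averaging as the combination scheme: a pool of polynomial regressors of different degrees is fit on bootstrap resamples of a toy dataset (all names and the toy problem are illustrative, not from the text), and their predictions are averaged. For averaging, the ensemble's mean squared error is mathematically guaranteed to be at most the average of the individual errors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: noisy samples of a nonlinear target function.
x = rng.uniform(-1, 1, 200)
y = np.sin(3 * x) + rng.normal(0, 0.2, x.shape)

x_test = np.linspace(-1, 1, 100)
y_test = np.sin(3 * x_test)

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

# Train a diverse pool: polynomial least-squares fits of different degrees,
# each on a different bootstrap resample of the training data.
preds = []
for degree in (2, 3, 4, 5):
    idx = rng.integers(0, len(x), len(x))        # bootstrap resample
    coeffs = np.polyfit(x[idx], y[idx], degree)  # fit one base model
    preds.append(np.polyval(coeffs, x_test))     # predict on the test grid

# Combination scheme: simple averaging of the individual predictions.
ensemble_pred = np.mean(preds, axis=0)

individual_errors = [mse(p, y_test) for p in preds]
ensemble_error = mse(ensemble_pred, y_test)
```

Note that the averaged prediction is only guaranteed to beat the *average* base model, not necessarily the single best one; the gain grows when the base models' errors are weakly correlated, which is exactly the diversity requirement discussed above.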