A committee of this simple kind gives equal weight to every model's prediction, so it typically yields little improvement over a single model. Boosting was developed to address this problem. Boosting is a technique for combining multiple 'base' classifiers to produce a form of committee whose performance can be significantly better than that of any single base classifier.
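To make the idea concrete, here is a minimal sketch of AdaBoost-style boosting on a toy 1-D problem (my own illustration with assumed details, not code from this post): each round fits a weak learner, a simple threshold 'stump', to the weighted data, then up-weights the points it misclassified so that later learners focus on the hard cases.

```python
import numpy as np

# Toy binary classification data with labels in {-1, +1}.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100)
t = np.where(x > 0.2, 1, -1)          # true rule: sign(x - 0.2)

w = np.full(len(x), 1 / len(x))       # uniform initial data weights
stumps = []                           # (threshold, polarity, alpha)

for _ in range(10):
    # Pick the weighted-error-minimising threshold and polarity.
    best = None
    for thr in np.linspace(-1, 1, 41):
        for pol in (1, -1):
            pred = pol * np.where(x > thr, 1, -1)
            err = np.sum(w[pred != t])
            if best is None or err < best[0]:
                best = (err, thr, pol)
    err, thr, pol = best
    err = max(err, 1e-10)                    # avoid log(0)
    alpha = 0.5 * np.log((1 - err) / err)    # this learner's vote weight
    pred = pol * np.where(x > thr, 1, -1)
    w *= np.exp(-alpha * t * pred)           # up-weight misclassified points
    w /= w.sum()
    stumps.append((thr, pol, alpha))

def boosted_predict(x_new):
    """Sign of the alpha-weighted sum of stump votes."""
    s = sum(a * p * np.where(x_new > th, 1, -1) for th, p, a in stumps)
    return np.sign(s)

print(np.mean(boosted_predict(x) == t))  # training accuracy
```

Unlike the equal-weight committee, each member here gets its own weight `alpha`, and the members are trained sequentially on reweighted data rather than independently.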
The committee is a natural starting point for combining several models (or, more precisely, for combining the outputs of several models). For example, we can build a committee simply by averaging the predictions of all the models.
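As a sketch of this simplest committee (a toy setup I am assuming here, not an example from the post), we can fit several linear models to bootstrap resamples of the data and average their predictions, y_COM(x) = (1/M) Σ_m y_m(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 2x + 1 plus noise.
x = rng.uniform(-1, 1, size=50)
y = 2 * x + 1 + rng.normal(scale=0.3, size=50)

M = 10           # number of committee members
coefs = []       # (slope, intercept) of each member
for _ in range(M):
    idx = rng.integers(0, len(x), size=len(x))        # bootstrap resample
    slope, intercept = np.polyfit(x[idx], y[idx], deg=1)
    coefs.append((slope, intercept))

def committee_predict(x_new):
    """Equal-weight average of all members' predictions."""
    preds = [a * x_new + b for a, b in coefs]
    return np.mean(preds, axis=0)

print(committee_predict(np.array([0.0, 0.5])))
```

Every member's prediction enters the average with the same weight 1/M, which is exactly the limitation boosting addresses.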
Bayesian model averaging (BMA) is another widely used method that looks very similar to model combination. However, the difference between BMA and combining models is significant: in BMA the entire dataset is assumed to have been generated by a single model, and the sum over models only expresses our uncertainty about which one it was, whereas in a combined model such as a mixture, different data points can be generated by different components.
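The contrast can be written out explicitly (notation mine, following the standard presentation):

```latex
% Mixture model: each data point x_n has its own latent component
% indicator, so different points can come from different components.
p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)

% Bayesian model averaging: the *whole* dataset X is assumed to be
% generated by a single model h; the sum expresses only our posterior
% uncertainty over which model that is.
p(X) = \sum_{h=1}^{H} p(X \mid h) \, p(h)
```

In the mixture, the sum over components appears inside the likelihood of every individual data point; in BMA, the sum over models sits outside the likelihood of the entire dataset.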
The mixture of Gaussians was discussed in the post 'Mixtures of Gaussians'. It can be used not only to introduce the EM algorithm but also as a strategy for improving model performance.
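For reference, a minimal EM sketch for a 1-D two-component Gaussian mixture looks like this (my own toy illustration; the details differ from the 'Mixtures of Gaussians' post referred to above):

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy data: half the points from N(-2, 0.5^2), half from N(3, 1^2).
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.0, 200)])

# Initial guesses for the parameters.
pi = np.array([0.5, 0.5])      # mixing coefficients
mu = np.array([-1.0, 1.0])     # means
var = np.array([1.0, 1.0])     # variances

for _ in range(50):
    # E step: responsibilities gamma[n, k] = p(component k | x_n).
    dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    gamma = dens / dens.sum(axis=1, keepdims=True)
    # M step: re-estimate parameters from responsibility-weighted averages.
    Nk = gamma.sum(axis=0)
    mu = (gamma * x[:, None]).sum(axis=0) / Nk
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)

print(np.sort(mu))   # should recover means near -2 and 3
```

Here the responsibilities assign each data point (softly) to its own component, which is precisely what distinguishes a combined model from Bayesian model averaging.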