Bayesian methods are widely used for selecting the number of components in mixture models, in part because frequentist methods have difficulty addressing this problem in general. Here we compare several Bayesianly motivated or justifiable methods for choosing the number of components in a one-dimensional Gaussian mixture model: posterior probabilities under a well-known proper prior, BIC, ICL, DIC, and AIC. We also introduce a new explicit unit-information prior for mixture models, analogous to the prior to which BIC corresponds in regular statistical models. We base the comparison on a simulation study designed to reflect published estimates of mixture model parameters from the scientific literature across a range of disciplines. We found that BIC clearly outperformed the five other methods, with the maximum a posteriori estimate under the established proper prior performing second best.
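To illustrate the kind of criterion-based selection compared here, the following is a minimal sketch (not the paper's simulation study) of choosing the number of components in a one-dimensional Gaussian mixture by minimizing BIC, using scikit-learn's `GaussianMixture`; the mixture parameters and candidate range K = 1..5 are hypothetical choices for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulate 1-D data from a two-component Gaussian mixture
# (means -5 and 5, unit variance; hypothetical example parameters).
x = np.concatenate([rng.normal(-5.0, 1.0, 150),
                    rng.normal(5.0, 1.0, 150)]).reshape(-1, 1)

# Fit mixtures with K = 1..5 components and score each by BIC and AIC
# (lower is better for both).
ks = list(range(1, 6))
bics, aics = [], []
for k in ks:
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(x)
    bics.append(gm.bic(x))
    aics.append(gm.aic(x))

# BIC-selected number of components.
best_k = ks[int(np.argmin(bics))]
print(best_k)
```

With well-separated components as above, BIC recovers the true K = 2; AIC penalizes model complexity less heavily and can favor larger K on harder problems.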