Information Criterion for Boltzmann Approximation Problems
This work considers the problem of approximating a density when it can be evaluated only up to a normalizing constant at a limited number of points. We call this problem the Boltzmann approximation (BA) problem. The BA problem is ubiquitous in statistics, arising, for example, when approximating a posterior density for Bayesian inference or when estimating an optimal density for importance sampling. Approximating the target density with a parametric model can be cast as a model selection problem. This problem cannot be addressed with traditional approaches that maximize the (marginal) likelihood of a model, for example the Akaike information criterion (AIC) or the Bayesian information criterion (BIC). We instead aim to minimize the cross-entropy, which gauges the deviation of a parametric model from the target density. We propose a novel information criterion called the cross-entropy information criterion (CIC) and prove that the CIC is an asymptotically unbiased estimator of the cross-entropy (up to a multiplicative constant) under some regularity conditions. We propose an iterative method that approximates the target density by minimizing the CIC. We demonstrate that the proposed method selects a parametric model that approximates the target density well. This is joint work with Yen-Chi Chen (University of Washington) and supported by the National Science Foundation (DMS-1952781).
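As a sketch of the objective the criterion targets (the notation below is an illustrative assumption, not taken from the abstract): write the target density as p(x) = q(x)/Z, where the unnormalized q can be evaluated but the constant Z is unknown, and let g_theta denote the parametric model. The cross-entropy to be minimized is then

```latex
% Hedged sketch; p, q, Z, and g_\theta are assumed notation.
\operatorname{CE}(g_\theta)
  \;=\; -\int p(x)\,\log g_\theta(x)\,dx
  \;=\; -\frac{1}{Z}\int q(x)\,\log g_\theta(x)\,dx .
```

Because Z does not depend on theta, minimizing the right-hand integral over the model class needs only evaluations of q, which is why an asymptotically unbiased estimator of the cross-entropy up to a multiplicative constant suffices for ranking candidate models.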