The widespread use of machine learning models calls for automatic change detection methods to monitor their behavior over time. As a machine learning model learns from a continuous, possibly evolving, stream of data, it is desirable and often critical to pair it with a companion change detection algorithm that facilitates monitoring and control. We present a versatile score-based change detection method that can detect a change in any number of (hidden) components of a machine learning model. The proposed statistical hypothesis test can be readily implemented for any model built in a differentiable programming framework. We establish the consistency of the hypothesis test and show how to calibrate it based on the theoretical results. We illustrate the versatility of the approach on linear regression models, time series models, text topic models, and latent-variable models, using both synthetic and real data.
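To make the idea concrete, the sketch below illustrates a classical score-based (Rao-type) change test for one of the settings mentioned above, a linear regression model. This is a simplified stand-in for the paper's general procedure, not the authors' exact method: a model is fitted on reference data, per-sample score contributions (gradients of the log-likelihood at the fitted parameters) are computed on a monitoring window, and a chi-square statistic tests whether their mean departs from zero. The function names (`fit_ols`, `score_change_statistic`) and the Gaussian-noise assumption are illustrative choices.

```python
import numpy as np
from scipy.stats import chi2


def fit_ols(X, y):
    # Closed-form least-squares fit of the reference regression model.
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta


def score_change_statistic(X, y, theta, sigma2):
    """Score-based change statistic on a monitoring window.

    Under a Gaussian linear model, the per-sample score contributions
    g_i = x_i * (y_i - x_i @ theta) / sigma2 have mean zero if no change
    has occurred. The statistic n * g_bar' S^{-1} g_bar is asymptotically
    chi-square with d degrees of freedom under the no-change hypothesis.
    """
    resid = y - X @ theta
    G = X * (resid / sigma2)[:, None]       # per-sample score contributions
    g_bar = G.mean(axis=0)                  # average score on the window
    S = np.cov(G, rowvar=False)             # estimated score covariance
    return len(y) * g_bar @ np.linalg.solve(S, g_bar)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 3
    theta_true = np.array([1.0, -2.0, 0.5])

    # Reference data: fit the model and estimate the noise variance.
    X_ref = rng.normal(size=(2000, d))
    y_ref = X_ref @ theta_true + 0.5 * rng.normal(size=2000)
    theta_hat = fit_ols(X_ref, y_ref)
    sigma2 = np.mean((y_ref - X_ref @ theta_hat) ** 2)

    # Monitoring window: once without a change, once with a shifted coefficient.
    X_mon = rng.normal(size=(500, d))
    y_null = X_mon @ theta_true + 0.5 * rng.normal(size=500)
    theta_shift = np.array([1.0, -2.0, 2.0])  # third coefficient changed
    y_alt = X_mon @ theta_shift + 0.5 * rng.normal(size=500)

    threshold = chi2.ppf(0.99, df=d)  # calibrated via the chi-square null
    t_null = score_change_statistic(X_mon, y_null, theta_hat, sigma2)
    t_alt = score_change_statistic(X_mon, y_alt, theta_hat, sigma2)
    print(f"no change: T = {t_null:.2f}, change: T = {t_alt:.2f}, "
          f"threshold = {threshold:.2f}")
```

Because the score is just a gradient of the log-likelihood, the same recipe extends to any model whose likelihood is expressed in a differentiable programming framework, which is the versatility the abstract emphasizes.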