Optimal transport (OT) is a versatile framework for comparing probability measures, with many applications in statistics, machine learning, and applied mathematics. However, OT distances suffer from computational and statistical scalability issues in high dimensions, which has motivated the study of regularized OT methods such as slicing, smoothing, and entropic penalization. In this talk, I will discuss several applications of regularized OT distances to problems in non-parametric inference and generative modelling, and how regularization addresses some shortcomings of vanilla OT. This includes joint work with Professor Ziv Goldfeld (Cornell ECE), Professor Kengo Kato (Cornell SDS), Sloan Nietert (Cornell CS), and Gabriel Rioux (Cornell CAM).
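
To make the entropic penalty concrete, here is a minimal sketch of entropically regularized OT computed with Sinkhorn iterations, using NumPy only. The point clouds, marginals, and regularization strength are illustrative choices, not details from the talk.

```python
import numpy as np

# Illustrative discrete OT problem: two small point clouds in R^2.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2))   # 5 source points
y = rng.normal(size=(6, 2))   # 6 target points
C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared Euclidean cost

a = np.full(5, 1 / 5)  # uniform source marginal
b = np.full(6, 1 / 6)  # uniform target marginal
eps = 0.5              # entropic regularization strength (assumed value)

K = np.exp(-C / eps)   # Gibbs kernel
u = np.ones(5)
for _ in range(500):   # Sinkhorn fixed-point iterations on the scalings
    v = b / (K.T @ u)
    u = a / (K @ v)

P = u[:, None] * K * v[None, :]  # regularized transport plan
cost = np.sum(P * C)             # transport cost under the regularized plan
print(np.allclose(P.sum(axis=1), a))  # marginal constraints are (approximately) met
```

The entropic penalty smooths the OT problem, which is what enables this simple matrix-scaling algorithm and underlies the improved statistical behavior discussed in the talk.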