We analyze stochastic conditional gradient methods for constrained optimization problems arising in over-parametrized machine learning. We show that one can leverage the interpolation-like conditions satisfied by such models to obtain improved oracle complexities. Specifically, when the objective function is convex, the method requires O(ϵ⁻²) oracle calls to find an ϵ-optimal solution. Furthermore, includi...
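As a rough illustration of the kind of method analyzed here, the following is a minimal sketch of a stochastic conditional gradient (Frank-Wolfe) loop on a toy interpolation-like problem; the problem data, batch size, and step-size schedule are illustrative assumptions, not the paper's exact algorithm or rates.

```python
import numpy as np

# Sketch (assumed setup): minimize f(x) = (1/2n)||Ax - b||^2 over the
# probability simplex with mini-batch stochastic gradients. The data is
# noiseless (b = A x_true with x_true a simplex vertex), so the model
# "interpolates": stochastic gradients vanish at the optimum.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[0] = 1.0
b = A @ x_true                                   # noiseless targets

x = np.full(d, 1.0 / d)                          # start at the simplex center
batch = 20
for k in range(300):
    idx = rng.choice(n, size=batch, replace=False)
    g = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # mini-batch stochastic gradient
    v = np.zeros(d)
    v[np.argmin(g)] = 1.0                         # LMO over the simplex: best vertex
    gamma = 2.0 / (k + 2)                         # classical Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * v               # convex combination stays feasible

loss = 0.5 * np.mean((A @ x - b) ** 2)
```

Each iteration costs one stochastic gradient and one linear minimization oracle call; over the simplex the LMO is just a coordinate argmin, which is what makes projection-free methods attractive for such constraint sets.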