The stochastic gradient descent (SGD) algorithm is widely used for parameter estimation, especially for huge datasets and online learning. While this recursive algorithm is popular for its computational and memory efficiency, quantifying the variability and randomness of its solutions has rarely been studied. This article aims at conducting statistical inference for SGD-based estimates in an online setting. In particular, we propose a fully es...
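To make the recursive computation concrete, the following is a minimal sketch of SGD with Polyak-Ruppert averaging for a linear regression model; the model, step-size schedule, and synthetic data are illustrative assumptions, not the specific setup of this article.

```python
import numpy as np

def sgd_linear_regression(X, y, lr0=0.5, seed=0):
    # Recursive SGD update with a running (Polyak-Ruppert) average of the
    # iterates; memory cost is O(d) regardless of the number of observations.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)       # current SGD iterate
    theta_bar = np.zeros(d)   # running average of iterates
    for t in range(n):
        i = rng.integers(n)   # draw one observation (online/streaming style)
        grad = (X[i] @ theta - y[i]) * X[i]        # gradient of squared loss at one point
        theta -= lr0 / np.sqrt(t + 1) * grad       # decaying step size
        theta_bar += (theta - theta_bar) / (t + 1) # recursive average update
    return theta_bar

# Illustrative synthetic data with a known true parameter.
rng = np.random.default_rng(1)
true_theta = np.array([2.0, -1.0])
X = rng.normal(size=(5000, 2))
y = X @ true_theta + 0.1 * rng.normal(size=5000)
est = sgd_linear_regression(X, y)
```

The averaged iterate `theta_bar` is the standard estimator whose asymptotic variability such inference procedures aim to quantify.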