Panagiotis (Panos) Toulis
Harvard University,
Department of Statistics
ptoulis at fas dot harvard dot edu


This page provides more information about the code and simulations presented in (Toulis et al., 2014) on implicit stochastic gradient descent (SGD) for large Generalized Linear Models (GLMs). Implicit SGD is a modification of the typical SGD algorithm with very attractive properties. It is a shrunken version of typical (explicit) SGD, where the shrinkage factor depends on the observed Fisher information; as a result, implicit SGD is significantly more stable than explicit SGD. Asymptotically the two methods are statistically equivalent, but in small-to-moderate samples implicit SGD is more stable and more robust to misspecification of the learning rate and to outliers.
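To illustrate the implicit update, here is a minimal Python sketch for a GLM with a nondecreasing mean function h (the sigmoid, for logistic regression). This is not the code accompanying the paper; the function names, learning-rate schedule, and simulated data below are illustrative assumptions. The key point is that the implicit update theta_n = theta_{n-1} + gamma_n * (y_n - h(x_n' theta_n)) * x_n reduces to a one-dimensional fixed-point problem, because the update direction is always x_n.

    import numpy as np
    from scipy.optimize import brentq
    from scipy.special import expit  # numerically stable sigmoid

    def implicit_sgd_glm(X, y, h=expit, lr=lambda n: 10.0 / (10.0 + n)):
        """One pass of implicit SGD over the rows of X for a GLM
        with nondecreasing mean function h (a sketch, not the
        authors' implementation; the schedule lr is illustrative)."""
        n_obs, p = X.shape
        theta = np.zeros(p)
        for n in range(1, n_obs + 1):
            x, y_n = X[n - 1], y[n - 1]
            gamma = lr(n)
            pred = x @ theta
            norm2 = x @ x
            # Explicit SGD would take the step r * x directly.
            r = gamma * (y_n - h(pred))
            if r != 0.0:
                # Implicit step: solve xi = gamma * (y_n - h(pred + xi*|x|^2)).
                # Since h is nondecreasing, the root lies between 0 and r.
                f = lambda xi: xi - gamma * (y_n - h(pred + xi * norm2))
                xi = brentq(f, min(0.0, r), max(0.0, r))
                theta = theta + xi * x
        return theta

    # Example: logistic regression on simulated data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(10000, 5))
    theta_true = np.array([1.0, 0.5, -0.5, 0.0, 2.0])
    y = rng.binomial(1, expit(X @ theta_true))
    print(implicit_sgd_glm(X, y))  # should be close to theta_true

Note that the scalar step xi always has the same sign as the explicit step r but a smaller magnitude, which is exactly the shrinkage property described above.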

The following are the main parts of this code: