Recall the general OMD template:
<aside> ⚙ Meta-Algorithm: Online Mirror Descent (OMD)
Parameters: strictly convex regularization $R$ over $W \subseteq \R^d$, stepsize $\eta>0$
Initialize $w_1 \in W$
For $t=1,2,\ldots,T$: play $w_t$, observe a (sub)gradient $g_t$ of the loss at $w_t$, and update:
$$ \newcommand{\E}{\mathbb E} \begin{align*} \nabla R(w_{t+1}') &= \nabla R(w_t) - \eta g_t \tag{step} \\ w_{t+1} &= \argmin_{w \in W} \; D_R(w,w_{t+1}') \tag{projection} \end{align*} $$
</aside>
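As a concrete sketch of the template, consider the negative-entropy regularizer $R(w) = \sum_i w_i \log w_i$ over the probability simplex. With this $R$, the (step) becomes a multiplicative update ($\log w_{t+1}' = \log w_t - \eta g_t$) and the Bregman (projection) onto the simplex reduces to renormalization, yielding the exponentiated-gradient algorithm. The function name and interface below are illustrative, not from the notes:

```python
import numpy as np

def omd_neg_entropy(grads, eta):
    """OMD with negative-entropy R over the probability simplex.

    grads: sequence of gradient vectors g_1, ..., g_T
    eta:   stepsize eta > 0
    Returns the iterates w_1, ..., w_{T+1}.
    """
    d = len(grads[0])
    w = np.full(d, 1.0 / d)  # w_1: uniform initialization in the simplex
    iterates = [w]
    for g in grads:
        # (step): grad R(w') = grad R(w) - eta*g, i.e. log w' = log w - eta*g
        w_prime = w * np.exp(-eta * np.asarray(g))
        # (projection): the KL (Bregman) projection onto the simplex
        # is simply renormalization
        w = w_prime / w_prime.sum()
        iterates.append(w)
    return iterates
```

Coordinates with smaller (more negative) gradients accumulate weight multiplicatively, which is exactly the mirror-descent geometry the entropy regularizer induces.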
Recall:
<aside> 💡 Definition: Bregman divergence
The Bregman divergence $D_f$ associated with a convex and differentiable $f: S \to \R$ is:
$$ \begin{align*} \forall x,y \in S:\quad D_f(y,x) = f(y)-f(x)-\nabla f(x) \cdot (y-x) \end{align*} $$
</aside>
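Two standard instances, as a quick check of the definition: for the squared Euclidean norm $f(y) = \tfrac12\|y\|_2^2$ we have $\nabla f(x) = x$, so

$$ D_f(y,x) = \tfrac12\|y\|^2 - \tfrac12\|x\|^2 - x \cdot (y-x) = \tfrac12\|y-x\|^2, $$

and for the negative entropy $f(y) = \sum_i y_i \log y_i$ over the simplex, using $\nabla f(x)_i = \log x_i + 1$ and $\sum_i y_i = \sum_i x_i = 1$,

$$ D_f(y,x) = \sum_i y_i \log y_i - \sum_i x_i \log x_i - \sum_i (\log x_i + 1)(y_i - x_i) = \sum_i y_i \log \frac{y_i}{x_i}, $$

which is the KL divergence.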

We will analyze OMD for regularizers $R$ that are strongly convex with respect to a norm $\|\cdot\|$: