Recap & scope

We revisit Stochastic Optimization:

<aside> 🚧 Stochastic Optimization (SO)

Goal:

$$
\newcommand{\E}{\mathbb E}
\begin{aligned} \min_{w \in W} \; F_D(w) = \E_{z \sim D}[f(w,z)] \end{aligned}
$$

given a sample $S$ of $n$ examples $z_1, \ldots, z_n \overset{\text{iid}}{\sim} D$

</aside>
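As a concrete instance of this template (a standard illustrative example, not taken from the notes), consider least-squares linear regression, where examples are input-label pairs and the per-example loss is the squared prediction error:

```latex
% Least-squares regression as a stochastic optimization problem
% (illustrative instance; not part of the original notes).
% Examples are pairs z = (x, y) with x \in \mathbb{R}^d and y \in \mathbb{R},
% and the per-example loss is the squared error:
\[
  f(w, z) = \tfrac{1}{2}\bigl(\langle w, x \rangle - y\bigr)^2,
  \qquad
  F_D(w) = \mathbb{E}_{(x,y) \sim D}
    \Bigl[\tfrac{1}{2}\bigl(\langle w, x \rangle - y\bigr)^2\Bigr].
\]
```

Here $W \subseteq \mathbb{R}^d$ and the sample $z_1, \ldots, z_n$ consists of observed input-label pairs.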

Previously:

This lecture: we explore the statistical learning view of SO

Setup for this lecture

The canonical setting of statistical learning is essentially a slight abstraction of SO, in which we allow a generic “hypothesis class” (not necessarily represented as a subset of $\mathbb{R}^d$):

<aside> 🚧 Statistical Learning

Setup: a hypothesis class $\mathcal H$, an example domain $Z$, a loss function $\ell : \mathcal H \times Z \to \mathbb R$, and an unknown distribution $D$ over $Z$

Goal:

$$
\min_{h \in \mathcal H} \; L_D(h) = \mathbb{E}_{z \sim D}[\ell(h,z)]
$$

given a sample $S$ of $n$ examples $z_1, \ldots, z_n \overset{\text{iid}}{\sim} D$

</aside>
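To see why the extra generality matters, a standard example (again, not from the notes) is binary classification with the 0-1 loss, where hypotheses are arbitrary predictors rather than vectors in $\mathbb{R}^d$:

```latex
% Binary classification with the 0-1 loss as a statistical learning problem
% (standard illustrative example, not part of the original notes).
% Examples are pairs z = (x, y) with y \in \{0, 1\}, and hypotheses are
% predictors h : X \to \{0, 1\}; the loss is the misclassification indicator:
\[
  \ell\bigl(h, (x, y)\bigr) = \mathbb{1}\bigl[h(x) \neq y\bigr],
  \qquad
  L_D(h) = \Pr_{(x,y) \sim D}\bigl[h(x) \neq y\bigr].
\]
```

In this instance the expected loss is simply the probability of misclassifying a fresh example drawn from $D$, and the class of predictors need not carry any linear or Euclidean structure.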