The notion of convexity marks the “great watershed of optimization”: in a very general sense, convex optimization problems are “easy”, while non-convex optimization problems are very often “hard” (we have seen examples in the previous lecture).
Before we can define what it means for a function to be convex, we first have to introduce the concept of a convex set.
<aside> 💡 Definition: Convex set
A set $S \subseteq \R^d$ is convex iff
$$ \lambda x + (1-\lambda)y \in S ~,
\qquad \forall x,y \in S ~, 0 \leq \lambda \leq 1. $$
</aside>
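The definition can be checked numerically on a concrete set. The sketch below (an illustration, not part of the lecture) samples random point pairs and mixing weights in the unit disk of $\R^2$, a convex set, and verifies that every convex combination stays inside; the helper names `in_disk` and `sample_disk` are hypothetical.

```python
import numpy as np

# Numeric sanity check of the convex-set definition for the
# unit disk S = {x in R^2 : ||x|| <= 1}, which is convex.
rng = np.random.default_rng(0)

def in_disk(x):
    # membership test with a small tolerance for floating point
    return np.linalg.norm(x) <= 1.0 + 1e-12

def sample_disk(rng):
    # rejection sampling of a uniform point in the unit disk
    while True:
        x = rng.uniform(-1.0, 1.0, size=2)
        if in_disk(x):
            return x

for _ in range(1000):
    x, y = sample_disk(rng), sample_disk(rng)
    lam = rng.uniform(0.0, 1.0)
    z = lam * x + (1.0 - lam) * y   # convex combination of x and y
    assert in_disk(z)               # definition: z must stay in S
print("all convex combinations stayed in the disk")
```

Running the same loop on a non-convex set (e.g. an annulus) would trip the assertion for some pair, which is exactly what the definition rules out.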

Examples of convex sets include:

The entire space: $S=\R^d$
Hyperplane:

$$ \{ x \in \R^d : a^\top x = b \}
$$

for given $a \in \R^d, b \in \R$ ($a \neq 0$)

Half-space:

$$ \{ x \in \R^d : a^\top x \leq b \}
$$

for given $a \in \R^d, b \in \R$ ($a \neq 0$)
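Convexity of the half-space follows in one line from linearity of $x \mapsto a^\top x$: for $x, y$ in the half-space and $0 \leq \lambda \leq 1$,

$$ a^\top\bigl(\lambda x + (1-\lambda)y\bigr) = \lambda\, a^\top x + (1-\lambda)\, a^\top y \leq \lambda b + (1-\lambda) b = b, $$

where the inequality uses $\lambda \geq 0$ and $1-\lambda \geq 0$. The same computation with equality in place of $\leq$ shows that the hyperplane is convex as well.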
