Optimization problems are arguably universal, in the sense that almost any problem in engineering and life can be cast as an optimization problem (though not, say, as a root-finding problem or an ODE!). In this module we discuss the character of these problems and of their solutions (local and global). We focus on gradient-based solution methods, i.e. methods that use the gradient vector to decide in which direction to search. Such methods can be applied effectively in very high-dimensional settings, e.g. with millions of unknowns. This module should be considered a modest introduction to numerical optimization; we end with an overview of the landscape of methods.
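As a minimal sketch of the gradient-based idea, the loop below repeatedly steps against the gradient to find a local minimizer. The function names, the fixed step size, and the quadratic example are illustrative choices, not part of this module's material:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a function by stepping against its gradient.

    grad: callable returning the gradient at a point.
    x0:   starting point (NumPy array).
    step: fixed step size (illustrative; real methods choose it adaptively).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient ~ 0: (local) stationary point
            break
        x = x - step * g              # move in the direction of steepest descent
    return x

# Example: minimize f(x) = ||x - c||^2, whose gradient is 2(x - c);
# the minimizer is x = c. The same loop works unchanged in any dimension.
c = np.array([1.0, -2.0, 3.0])
x_min = gradient_descent(lambda x: 2 * (x - c), x0=np.zeros(3))
```

Note that only gradient evaluations are needed, which is why such methods scale to millions of unknowns: each iteration costs one gradient computation, regardless of how complicated the objective is.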