When the objective function and the constraints are twice differentiable, it is easy to check whether an extremum is a maximum or a minimum of the functional. We appeal to second-order differentials, known as Hessians.
Definition 1 The Hessian matrix $\nabla^2 f(x)$ for the functional $f(x)$ has entries
$$\left[\nabla^2 f(x)\right]_{ij} = \frac{\partial^2 f(x)}{\partial x_i \, \partial x_j}.$$
Lemma 1 Let $\nabla^2 \mathcal{L}(x_0)$ be the Hessian of the Lagrangian $\mathcal{L}$ and let $x_0$ be an extremum. If $y^T \nabla^2 \mathcal{L}(x_0)\, y > 0$ for all $y$ in the tangent space $M$ of the constraints at $x_0$, then $x_0$ is a minimizer. If $y^T \nabla^2 \mathcal{L}(x_0)\, y < 0$ for all $y \in M$, then $x_0$ is a maximizer.
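Lemma 1 can be checked numerically. The sketch below (assuming NumPy; the Hessian and constraint data are hypothetical stand-ins, not the example's actual values) builds an orthonormal basis for the tangent space $M$ from the constraint Jacobian and inspects the eigenvalues of the Hessian projected onto that basis:

```python
import numpy as np

def classify_extremum(H, G, tol=1e-12):
    """Classify an extremum from the Hessian H of the Lagrangian and the
    constraint Jacobian G (rows are constraint gradients). Returns
    'minimum', 'maximum', or 'indefinite on tangent space'."""
    # Orthonormal basis Z for the tangent space M = {y : G y = 0} via the SVD.
    _, s, Vt = np.linalg.svd(G)
    rank = int(np.sum(s > tol))
    Z = Vt[rank:].T                      # columns span the null space of G
    w = np.linalg.eigvalsh(Z.T @ H @ Z)  # eigenvalues of the projected Hessian
    if np.all(w > tol):
        return "minimum"
    if np.all(w < -tol):
        return "maximum"
    return "indefinite on tangent space"

# Hypothetical data: H = 2A for f(x) = x1*x2 in R^3, constraint 1^T x = c.
H = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
G = np.ones((1, 3))
print(classify_extremum(H, G))  # indefinite on tangent space
```

The SVD route avoids forming the null space by hand and works for any number of linear constraints.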
Example 1 Find the extremum of a quadratic functional $f(x)$ subject to an equality constraint, and determine whether it is a maximum or a minimum.
To begin, we write the optimization's equality constraint in the generic form $g(x) = 0$.
The objective function can be written in the quadratic form
$$f(x) = x^T A x,$$
where the symmetric matrix $A$ has entries $a_{ij}$ given by the coefficients of the products $x_i x_j$ in $f$, with the coefficient of each cross term split evenly between $a_{ij}$ and $a_{ji}$.
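As a quick sanity check of the $x^T A x$ form (using a hypothetical stand-in objective $f(x) = x_1 x_2$, since the example's own polynomial is not reproduced here):

```python
import numpy as np

# Hypothetical objective f(x) = x1 * x2 in R^3: the cross term x1*x2 has
# coefficient 1, split evenly between a_12 and a_21 to keep A symmetric.
A = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.0, 0.0]])

x = np.array([2.0, 3.0, -1.0])
print(x @ A @ x)  # equals f(x) = x1 * x2 = 6.0
```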
Since $A$ is symmetric, the gradient for this function is given by
$$\nabla f(x) = 2Ax.$$
We can also rewrite the equality constraint as $\mathbf{1}^T x = c$, where $\mathbf{1}$ denotes a vector with entries equal to one of appropriate size. Therefore, its gradient is equal to $\nabla g(x) = \mathbf{1}$. The resulting gradient of the Lagrangian is set to zero to obtain the solution:
$$\nabla_x \mathcal{L}(x, \lambda) = 2Ax + \lambda \mathbf{1} = 0.$$
Solving this equation for $x$ in terms of $\lambda$ and substituting the result into the constraint $\mathbf{1}^T x = c$ yields the value of $\lambda$; substituting that value back gives the optimization's solution $x_0$.
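Stationarity of the Lagrangian together with the constraint form one linear system in $(x, \lambda)$, which can be solved directly. A minimal sketch, assuming NumPy; the matrix $A$ and value $c$ below are hypothetical stand-ins for the example's data:

```python
import numpy as np

# Hypothetical stand-ins: f(x) = x^T A x with the A below (i.e., f(x) = x1*x2),
# subject to the constraint 1^T x = c.
A = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
c = 1.0
n = A.shape[0]
ones = np.ones(n)

# Stack stationarity (2Ax + lambda*1 = 0) and the constraint (1^T x = c):
# [ 2A    1 ] [x     ]   [0]
# [ 1^T   0 ] [lambda] = [c]
K = np.block([[2 * A, ones[:, None]],
              [ones[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(n), [c]])
sol = np.linalg.solve(K, rhs)
x0, lam = sol[:n], sol[n]
print(x0, lam)  # stationary point and Lagrange multiplier
```

This block form is the standard KKT system for an equality-constrained quadratic program; it is solvable whenever the bordered matrix is nonsingular.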
We can also solve for the Hessians of $f$ and $g$: since $f(x) = x^T A x$ is quadratic and $g(x) = \mathbf{1}^T x - c$ is affine,
$$\nabla^2 f(x) = 2A, \qquad \nabla^2 g(x) = 0.$$
We therefore obtain that the Hessian of the Lagrangian is equal to
$$\nabla^2 \mathcal{L}(x, \lambda) = \nabla^2 f(x) + \lambda \nabla^2 g(x) = 2A.$$
At this point we need to check whether the product $y^T \nabla^2 \mathcal{L}(x_0)\, y$ is positive or negative for all $y \in M$, the tangent space defined as
$$M = \{\, y : \nabla g(x_0)^T y = 0 \,\} = \{\, y : \mathbf{1}^T y = 0 \,\}.$$
It is easy to see that $y \in M$ if and only if the entries of $y$ sum to zero. To begin, we check whether the eigenvalues of $\nabla^2 \mathcal{L} = 2A$ are all positive or all negative: a calculation returns eigenvalues of mixed signs. Since neither case occurred, we have to specifically consider vectors $y$ for which $\mathbf{1}^T y = 0$:
It turns out that we can find $y \in M$ for which the value of $y^T \nabla^2 \mathcal{L}(x_0)\, y$ is positive, and others for which it is negative. Therefore, this extremum is neither a maximum nor a minimum: we have found an inflection (saddle) point.
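The sign test on the tangent space can be reproduced numerically. A sketch, assuming NumPy and a hypothetical Hessian $\nabla^2 \mathcal{L} = 2A$ for a stand-in objective $f(x) = x_1 x_2$ in $\mathbb{R}^3$ (the example's actual matrix is not reproduced here):

```python
import numpy as np

# Hypothetical Hessian of the Lagrangian: 2A for f(x) = x1 * x2 in R^3.
H = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])

# Two tangent vectors, i.e., vectors whose entries sum to zero (1^T y = 0):
y_neg = np.array([1.0, -1.0, 0.0])
y_pos = np.array([1.0, 1.0, -2.0])

print(y_neg @ H @ y_neg)  # -2.0: the quadratic form is negative here...
print(y_pos @ H @ y_pos)  #  2.0: ...and positive here
```

Because the form takes both signs on $M$, neither condition of Lemma 1 holds, matching the saddle-point conclusion above.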