For convenience, place the candidate point at the origin and move in the direction of v. The speed doesn't matter, since we are moving along the line determined by v, so we may assume v is a unit vector. The composition f(vt) is a function of one variable t: the height of the surface as you move in the direction of v. A local minimum or maximum of f is a local minimum or maximum along every such line, so this one-variable function has derivative 0 at t = 0. Apply the chain rule, and ∇f.v has to equal 0. This holds for every v; in particular, it holds for the coordinate vectors in n-space, and these in turn extract the partial derivatives. Therefore all partial derivatives must equal 0. This is a necessary (though not sufficient) condition for a local minimum or maximum.
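As a minimal sketch of this first-order condition, take a hypothetical surface f(x, y) = x² + y², which has a critical point at the origin, estimate the gradient by central differences, and check that the directional derivative ∇f.v vanishes for several directions v (the function, step size, and sample angles here are all illustrative choices, not anything fixed by the text):

```python
import math

# Hypothetical example surface: f(x, y) = x^2 + y^2, critical point at the origin.
def f(x, y):
    return x * x + y * y

def gradient(f, x, y, h=1e-6):
    """Central-difference estimate of (df/dx, df/dy)."""
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy

# At the origin the gradient vanishes, so the directional derivative
# grad(f) . v is 0 for every unit vector v.
gx, gy = gradient(f, 0.0, 0.0)
for angle in (0.0, 0.7, 2.1):            # a few arbitrary directions
    v = (math.cos(angle), math.sin(angle))
    d = gx * v[0] + gy * v[1]
    print(abs(d) < 1e-9)                 # True: level in every direction
```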

If all partials are 0 then ∇f = 0, and ∇f.v = 0, and the surface is level in every direction. If you are a sufficiently small ant, the surface looks like the plains of Nebraska. Of course it may be curving away, as the earth does; you just can't see it.

Let M be the n×n matrix of second partials of f. In other words, the entry in row i, column j is the second partial of f with respect to xi and then xj. This is called the Hessian matrix of f. Write the expression v*M*v, where v is a row vector on the left and a column vector on the right, and * is matrix multiplication. The result is the expression derived above, i.e. the second derivative along v. Evaluate M at 0 to find the second derivative, at the origin, of the path through the origin in the direction of v.
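As a sketch of this claim, take a hypothetical quadratic f(x, y) = x² + 3xy + 2y², build its 2×2 Hessian from central-difference second partials, and compare v*M*v with the second derivative of t ↦ f(tv) computed directly (the function, the direction v, and the step size are illustrative assumptions):

```python
def f(x, y):
    # Hypothetical example: f(x, y) = x^2 + 3xy + 2y^2
    return x * x + 3 * x * y + 2 * y * y

def hessian(f, x, y, h=1e-4):
    """2x2 matrix of second partials, by central differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / (h * h)
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / (h * h)
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h * h)
    return [[fxx, fxy], [fxy, fyy]]

def quad_form(M, v):
    """v * M * v, with v a row vector on the left and a column vector on the right."""
    return sum(v[i] * M[i][j] * v[j] for i in (0, 1) for j in (0, 1))

M = hessian(f, 0.0, 0.0)          # [[2, 3], [3, 4]] up to rounding
v = (0.6, 0.8)                    # a unit vector
# Second derivative of t -> f(t*v) at t = 0, computed directly:
h = 1e-4
g = lambda t: f(t * v[0], t * v[1])
direct = (g(h) - 2 * g(0) + g(-h)) / (h * h)
print(quad_form(M, v), direct)    # the two agree
```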

By the one-dimensional theorems, a second derivative that is positive in every direction implies a local minimum, and one that is negative in every direction implies a local maximum. But v could be any unit vector, so how can we determine the sign of v*M*v for all v at once?

Assume the mixed partials exist near the origin and are continuous at the origin; then, by Clairaut's theorem, the mixed partials are equal. This makes M a symmetric matrix.
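The symmetry can be checked numerically on a hypothetical smooth example, f(x, y) = x²y + sin(xy). Starting from its hand-computed first partials, differentiate in each of the two possible orders by central differences (the function, the sample point, and the step size are illustrative assumptions):

```python
import math

# Hypothetical smooth example: f(x, y) = x^2 y + sin(xy).
# Its first partials, computed by hand:
def f_x(x, y):
    return 2 * x * y + y * math.cos(x * y)   # df/dx

def f_y(x, y):
    return x * x + x * math.cos(x * y)       # df/dy

h = 1e-5
x0, y0 = 0.5, -0.3                           # an arbitrary sample point
f_xy = (f_x(x0, y0 + h) - f_x(x0, y0 - h)) / (2 * h)  # d/dy of df/dx
f_yx = (f_y(x0 + h, y0) - f_y(x0 - h, y0)) / (2 * h)  # d/dx of df/dy
print(f_xy, f_yx)   # the two orders agree, as Clairaut's theorem promises
```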

At this point I refer you to the theory of quadratic forms, which says v*M*v is positive for every nonzero v iff all the eigenvalues of M are positive, and v*M*v is negative for every nonzero v iff all the eigenvalues of M are negative. This provides a minimum/maximum test similar to the concavity test in one variable. An extremal point has to have first partials equal to 0. If the matrix of second partials has all positive eigenvalues, the point is a local minimum. If it has all negative eigenvalues, the point is a local maximum. If some eigenvalues are positive and some are negative, we have a saddle point. If 0 is an eigenvalue and the remaining eigenvalues do not disagree in sign, the test is inconclusive. One can employ higher order derivative tests, just as we did in one dimension, but that gets very complicated very quickly.
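The whole classification can be sketched in the 2×2 case, where the eigenvalues of a symmetric matrix [[a, b], [b, c]] have a closed form. The helper names and the sample Hessians below are illustrative choices, not anything fixed by the text:

```python
import math

def eigenvalues_2x2(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]]."""
    mean = (a + c) / 2
    r = math.sqrt(((a - c) / 2) ** 2 + b * b)
    return mean - r, mean + r

def classify(a, b, c, tol=1e-12):
    """Second-derivative test at a critical point with Hessian [[a, b], [b, c]]."""
    lo, hi = eigenvalues_2x2(a, b, c)
    if lo > tol:
        return "local minimum"       # all eigenvalues positive
    if hi < -tol:
        return "local maximum"       # all eigenvalues negative
    if lo < -tol and hi > tol:
        return "saddle point"        # eigenvalues of both signs
    return "inconclusive"            # a zero eigenvalue: need higher-order terms

print(classify(2, 0, 2))    # f = x^2 + y^2     -> local minimum
print(classify(-2, 0, -2))  # f = -(x^2 + y^2)  -> local maximum
print(classify(2, 0, -2))   # f = x^2 - y^2     -> saddle point
print(classify(0, 0, 0))    # f = x^4 + y^4 at 0 -> inconclusive
```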