in reply to Re: Help with Matrix math! in thread Help with Matrix math!
In this case you have to take into account the constraint on the vector x. The quadratic form x_{i}D_{ij}x_{j} is unbounded on R^{n}, so a derivative test (even with a Hessian test) is only going to find local extreme points, and there is no guarantee that those points will lie on the surface of interest (i.e., x'x = 1). The difference is that for a point to be a local maximum when x is constrained, the derivative only needs to vanish when evaluated in the tangent space of the constraining surface at that point, not in every direction.
The standard way to include the surface constraint is to use Lagrangian multipliers.
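As a sketch of how the multiplier method plays out here (assuming D is symmetric), extremizing the Lagrangian recovers an eigenvalue problem:

```latex
% Stationary points of x'Dx subject to x'x = 1, with multiplier \lambda:
L(x,\lambda) = x^{\top} D x \;-\; \lambda \left( x^{\top} x - 1 \right)

% Setting the gradient with respect to x to zero (D symmetric):
\nabla_x L = 2 D x - 2 \lambda x = 0
\quad\Longrightarrow\quad
D x = \lambda x
```

So the constrained extrema are unit eigenvectors of D, and at such a point the form takes the value x'Dx = λ x'x = λ.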
Re^3: Help with Matrix math! by randyk (Parson) on Dec 18, 2007 at 01:41 UTC 
I was thinking of the problem differently: find the extremal points of x^{†}Dx, which leads to the eigenvalue equation Dx = λx. This equation doesn't determine x^{†}x completely, so one is free to impose, for example, x^{†}x = 1 as a normalization condition. But you're right that if x^{†}x = 1 is intended as a true constraint, then a method like Lagrange multipliers should be used.
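A quick numerical check of the eigenvalue picture (a NumPy sketch, not part of the original thread; D here is just an arbitrary random symmetric matrix):

```python
# Sketch: for symmetric D, the extrema of x'Dx on the unit sphere are
# eigenvectors of D, and the extremal values are the eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
D = (A + A.T) / 2               # symmetrize so eigenvalues are real

# eigh returns eigenvalues in ascending order, orthonormal eigenvectors
vals, vecs = np.linalg.eigh(D)

# The quadratic form at each unit eigenvector equals its eigenvalue.
for lam, x in zip(vals, vecs.T):
    assert np.isclose(x @ D @ x, lam)

# Random unit vectors never beat the top eigenvalue or undercut the bottom.
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    q = x @ D @ x
    assert vals[0] - 1e-9 <= q <= vals[-1] + 1e-9
```

The scale freedom mentioned above shows up here too: any multiple of an eigenvector satisfies Dx = λx, and normalizing to x'x = 1 just picks a representative.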
