madslynge has asked for the
wisdom of the Perl Monks concerning the following question:
Hey, I need some help with maximizing a quadratic form:
max_x x'Dx
s.t. x'x = 1
D is a symmetric (n x n) matrix and x is an unknown (n x 1) vector.
Re: Help with Matrix math! by ikegami (Pope) on Dec 16, 2007 at 18:56 UTC 
You could use PDL, an efficient math library for Perl. It can do matrix math with ease.
Re: Help with Matrix math! by ww (Bishop) on Dec 16, 2007 at 18:24 UTC 
What, if anything, have you tried? Is there a Perl component to this question or did you mistake the Monastery for the Cloistered Towers of MathMonks?
And, please, don't double-post your questions.
Re: Help with Matrix math! by pc88mxer (Vicar) on Dec 16, 2007 at 21:16 UTC 
This sounds a lot like a homework problem, but I'll give you the following pointers:
- This is equivalent to finding the largest eigenvalue of D.
- A standard way of solving this problem is to use the "power method", which is documented here: wikipedia entry
- The Perl module PDL is useful for performing matrix operations.
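The power method mentioned above fits in a few lines. Here is a sketch in Python rather than Perl/PDL, purely for illustration (the function names and the example matrix are mine); the same loop translates directly to PDL. One caveat: power iteration converges to the eigenvalue of largest magnitude, which for this maximization problem is the one you want only when the largest eigenvalue also has the largest absolute value (otherwise shift D first, e.g. iterate on D + cI).

```python
# Power iteration: find the dominant eigenvector of a symmetric
# matrix D. Since x is renormalized every step, x'x = 1 holds, and
# the Rayleigh quotient x'Dx converges to the dominant eigenvalue.

def mat_vec(D, x):
    return [sum(d * xi for d, xi in zip(row, x)) for row in D]

def norm(x):
    return sum(xi * xi for xi in x) ** 0.5

def power_method(D, iters=200):
    x = [1.0] * len(D)                    # arbitrary starting vector
    for _ in range(iters):
        y = mat_vec(D, x)
        n = norm(y)
        x = [yi / n for yi in y]          # renormalize: keeps x'x = 1
    # Rayleigh quotient x'Dx estimates the eigenvalue
    lam = sum(xi * yi for xi, yi in zip(x, mat_vec(D, x)))
    return lam, x

D = [[2.0, 1.0],
     [1.0, 2.0]]                          # eigenvalues 3 and 1
lam, x = power_method(D)
print(round(lam, 6))                      # -> 3.0
```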
Re: Help with Matrix math! by randyk (Parson) on Dec 16, 2007 at 21:33 UTC 
If you set aside for a moment the constraint on the vector x, then the problem of maximizing
x_{i}D_{ij}x_{j}
(a sum over repeated indices is implied) translates, upon taking the derivative with respect to x_{i},
into the equation
D_{ij}x_{j} = 0.
This is a special form of the
eigenvalue equation for the matrix D, with the
eigenvalue being 0. For nontrivial solutions to exist, the matrix D cannot have an inverse, so its
determinant must vanish. Finding the eigenvectors can
be done with PDL; see
Mastering Algorithms with Perl for a discussion. The
Math::Cephes::Matrix module can also do this for real symmetric matrices. Generally, the components of the
eigenvectors found in this way are not completely determined; by convention, the normalization x_{i}x_{i} = 1 is imposed.
Update: The preceding needs to be augmented by a
test on the
Hessian of the matrix to determine what type of
critical point is present.
 [reply] 

In this case you have to take into account the constraint on the vector x. The quadratic form x_{i}D_{ij}x_{j} is unbounded on R^{n}, so a derivative test (even with a Hessian test) is only going to find local extreme points, and there is no guarantee that those points will lie on the surface of interest (i.e., x'x = 1). The difference is that for a point to be a local maximum when x is constrained, the derivative only needs to be zero when evaluated in the tangent space of the constraining surface at that point, not zero in every direction.
The standard way to include the surface constraint is to use Lagrange multipliers.
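For the record, the Lagrange-multiplier calculation is short; a sketch (using the symmetry of D):

```latex
% Maximize x^T D x subject to x^T x = 1 via a multiplier \lambda:
L(x, \lambda) = x^T D x - \lambda \, (x^T x - 1)
% Stationarity (D symmetric, so \nabla_x (x^T D x) = 2 D x):
\nabla_x L = 2 D x - 2 \lambda x = 0
\quad\Longrightarrow\quad
D x = \lambda x
% So every critical point is an eigenvector of D, and there
x^T D x = \lambda \, x^T x = \lambda,
% hence the constrained maximum is the largest eigenvalue of D.
```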

I was thinking of the problem differently: find the extremal points of
x^{†}Dx, which leads to the eigenvalue
equation Dx = 0. This equation doesn't determine
x^{†}x completely, so one is free to impose, for example, x^{†}x = 1 as a
normalization condition. But you're right that if
x^{†}x = 1 is intended as a true constraint, then a method like Lagrange multipliers should be used.
Re: Help with Matrix math! by CountZero (Bishop) on Dec 16, 2007 at 20:12 UTC 
How would you do it by hand / paper & pencil? Do you apply a certain algorithm? If so, that algorithm can then most probably be translated into a computer program.
CountZero: "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." -- The Tao of Programming, 4.1, Geoffrey James
Re: Help with Matrix math! by rg0now (Chaplain) on Dec 17, 2007 at 19:04 UTC 
What you are trying to solve is a nonlinearly constrained nonlinear (actually, quadratic) mathematical program. This is a tough one, I can tell you...
However, all is not lost, because you might be able to simplify your problem into something actually solvable.
First: are you absolutely sure that you want to optimize over the boundary of the unit ball (this is what the constraint x'x=1 imposes)? Is it enough to constrain x to a sane region instead, e.g., the n-dimensional unit box? If so, you get a quadratic program:
max_x (x' D x)
s.t. -1 <= x_i <= 1 \forall i=1..n
See more on quadratic programs here: Quadratic programming.
Second: everything depends on whether your D is negative definite (all eigenvalues are negative) or not. If not, you are out of luck: the problem is (at least) NP-hard. If, on the other hand, D is nicely negative definite, then the objective function is concave and the uniqueness of the optimal solution is guaranteed. In this case, you could use a commercial quadratic solver, like CPLEX, or, after suitable linearization, a linear-program solver, too. E.g., Math::GLPK has a nice Perl interface (believe me it's nice: I wrote it!...:).
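Before reaching for CPLEX or GLPK, the "nicely negative definite" condition is easy to check: D is negative definite iff -D is positive definite, i.e. iff all leading principal minors of -D are strictly positive (Sylvester's criterion). A small sketch in Python for illustration (helper names and example matrices are mine); a naive Laplace-expansion determinant is fine for the small matrices shown here:

```python
# Sylvester's criterion: D is negative definite iff every leading
# principal minor of -D is strictly positive.

def det(M):
    # Laplace expansion along the first row; O(n!) but fine for small n.
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def is_negative_definite(D):
    n = len(D)
    negD = [[-D[i][j] for j in range(n)] for i in range(n)]
    # Check the k x k leading principal minors of -D for k = 1..n
    return all(det([row[:k] for row in negD[:k]]) > 0
               for k in range(1, n + 1))

D_bad  = [[2.0, 1.0], [1.0, 2.0]]     # positive definite: not concave
D_good = [[-2.0, 1.0], [1.0, -2.0]]   # eigenvalues -1 and -3
print(is_negative_definite(D_bad))    # -> False
print(is_negative_definite(D_good))   # -> True
```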