Sunday, 25 August 2013

Using gradient descent and Newton's method combined

I have a function f(X) where X = A + B + C. Here A is a diagonal matrix with
the variable a on its diagonal, B is another diagonal matrix with the variable
b on its diagonal, and C is a positive definite matrix with entries $c_{ij}$.
Now I want to optimize this function over all of these variables. My idea is
to keep A and B constant and first optimize over C using Newton's method, then
keep C constant and optimize over a and b using gradient descent, alternating
between the two steps. I am not sure whether this approach is valid or whether
it will converge. Suggestions guys?
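The alternating scheme described above (a form of block coordinate descent) can be sketched in code. Since the post does not specify f, the sketch below assumes a hypothetical least-squares objective f(X) = ||X - T||_F^2 for some target matrix T, with A = a*I and B = b*I; for this f the Newton step over C is exact, and the gradients with respect to a and b are traces. All names here (T, n, the learning rate) are illustrative assumptions, not part of the original question.

```python
import numpy as np

# Hypothetical objective (the post does not specify f):
#   f(X) = ||X - T||_F^2,  X = A + B + C,
# with A = a*I, B = b*I, and C a matrix variable.
n = 4
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n))
T = T + T.T + n * np.eye(n)  # symmetric, diagonally dominant target

def f(a, b, C):
    X = (a + b) * np.eye(n) + C
    return np.sum((X - T) ** 2)

a, b = 0.0, 0.0
C = np.eye(n)
lr = 0.01  # step size for the gradient-descent block

for it in range(50):
    # --- Newton step over C, with a and b held fixed ---
    # grad_C f = 2 (X - T); the (vectorized) Hessian is 2 I,
    # so the Newton step C <- C - H^{-1} grad solves this
    # subproblem exactly for this particular f.
    X = (a + b) * np.eye(n) + C
    G = 2.0 * (X - T)
    C = C - 0.5 * G

    # --- gradient-descent step over a and b, with C held fixed ---
    # dX/da = dX/db = I, so df/da = df/db = 2 tr(X - T).
    X = (a + b) * np.eye(n) + C
    g = 2.0 * np.trace(X - T)
    a -= lr * g
    b -= lr * g

print(f(a, b, C))  # objective after alternating updates
```

For this toy objective the Newton block already minimizes over C exactly, so the loop converges almost immediately; for a general f, each block would only decrease the objective, and convergence of the alternation depends on the structure of f (e.g. joint convexity in (a, b, C)).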
