The conjugate gradient method solves a system of linear equations, Ax = b, where A is symmetric, without computing the inverse of A. It requires only a very small amount of memory, and hence is particularly suitable for large-scale systems.
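As a sketch, a textbook conjugate gradient iteration for symmetric positive definite A could look like this (the function and variable names are illustrative, not taken from Cao's submission):

```matlab
function x = conjgrad_sketch(A, b, tol)
% Solve A*x = b for symmetric positive definite A by conjugate gradient.
% No inverse or factorization of A is formed; only matrix-vector products.
    if nargin < 3, tol = 1e-10; end
    x = zeros(size(b));           % initial guess
    r = b - A*x;                  % residual
    p = r;                        % first search direction
    rsold = r'*r;
    for k = 1:length(b)
        Ap = A*p;
        alpha = rsold / (p'*Ap);
        x = x + alpha*p;          % step along the search direction
        r = r - alpha*Ap;         % update the residual
        rsnew = r'*r;
        if sqrt(rsnew) < tol
            break;                % converged
        end
        p = r + (rsnew/rsold)*p;  % next A-conjugate direction
        rsold = rsnew;
    end
end
```

In exact arithmetic the loop terminates in at most n steps, but for well-conditioned A it typically converges far sooner, which is where the speed advantage comes from.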
It is faster than other approaches such as Gaussian elimination when A is well-conditioned. For example,
A = U*diag(s+max(s))*U'; % makes A symmetric and well-conditioned
Conjugate gradient is about two to three times faster than A\b, which uses Gaussian elimination.
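A rough way to reproduce that timing comparison (matrix size and tolerance here are illustrative, and pcg stands in for the submission's solver):

```matlab
n = 2000;
[U,~] = qr(randn(n));          % random orthogonal matrix
s = rand(n,1);
A = U*diag(s + max(s))*U';     % symmetric; eigenvalues in (max(s), 2*max(s)], so cond(A) <= 2
b = randn(n,1);

tic; x1 = A\b;               t_backslash = toc;
tic; x2 = pcg(A, b, 1e-10, n); t_cg      = toc;
```

The measured ratio will vary with n, the conditioning of A, and the MATLAB version, so claims like "two to three times faster" should be checked on the machine at hand.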
Could anybody tell me how to evaluate the time required by the SVD? Please note that the svd called directly from the workspace is a built-in function, which is much faster than the svd.m available on the MATLAB website. So which one is more suitable for evaluating the required time?
Actually, I am trying to compare a new algorithm against the SVD in computational cost. I suppose the built-in SVD function is faster than the source-code SVD function. Could anybody help me? Thanks a lot in advance! My email address is email@example.com.
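One reasonably fair way to time either version is timeit, which handles warm-up and repetition for you. Assuming the downloaded svd.m is first on the path (so it shadows the built-in), the two can be timed side by side; builtin forces the built-in version regardless of shadowing:

```matlab
A = randn(1000);                              % test matrix; size is illustrative
t_builtin = timeit(@() builtin('svd', A));    % built-in LAPACK-based svd
t_file    = timeit(@() svd(A));               % whichever svd.m is first on the path
```

For comparing algorithms, it is usually more meaningful to report operation counts or timings over a range of matrix sizes than a single measurement.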
A\b uses Cholesky factorization if A is symmetric. It switches to Gaussian elimination (LU) when A is not symmetric, or when the factorization fails due to a negative square-root input.
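That test can be mimicked explicitly with chol's second output, which reports failure instead of raising an error (a sketch, not what backslash literally does internally):

```matlab
[R, p] = chol(A);        % p == 0 iff the upper triangle of A defines an SPD matrix
if p == 0
    x = R \ (R' \ b);    % two triangular solves using the Cholesky factor
else
    x = A \ b;           % backslash falls back to LU (Gaussian elimination)
end
```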
I don't know if it is present in all versions, but MATLAB already ships a conjugate gradient solver (pcg).
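Basic usage of the stock solver, with an illustrative tolerance and iteration cap:

```matlab
tol = 1e-8; maxit = 500;
x = pcg(A, b, tol, maxit);   % A must be symmetric positive definite
```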
Anyway, thanks to Cao; submissions that work are always useful, and in the worst case you can study the algorithm.
To consider two trivial cases:
Changing the initial value to x = b is slightly faster.
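With the stock pcg solver, the initial guess is the seventh argument, so starting from b instead of the default zero vector would look like this (whether it helps depends on how close b is to the solution):

```matlab
x = pcg(A, b, 1e-8, 500, [], [], b);  % the two [] skip the preconditioner slots; b is x0
```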