Polynomial eigenvalue problem
[X,e] = polyeig(A0,A1,...,Ap)
e = polyeig(A0,A1,...,Ap)
[X,e,s] = polyeig(A0,A1,...,Ap)
where the polynomial degree p is a nonnegative integer, and A0, A1, ..., Ap are input matrices of order n. The output consists of a matrix X of size n-by-n*p whose columns are the eigenvectors, and a vector e of length n*p containing the eigenvalues.
If lambda is the jth eigenvalue in e, and x is the jth column of eigenvectors in X, then (A0 + lambda*A1 + ... + lambda^p*Ap)*x is approximately 0.
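This residual identity can be checked numerically. The following NumPy/SciPy sketch (not MATLAB code, and not necessarily polyeig's actual algorithm) solves a quadratic polynomial eigenvalue problem via the standard first companion linearization and verifies that (A0 + lambda*A1 + lambda^2*A2)*x is near zero for every eigenpair; the function name polyeig_sketch and the test matrices are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import eig

def polyeig_sketch(*A):
    """Solve (A0 + lam*A1 + ... + lam^p*Ap) x = 0 via a first companion
    linearization. A sketch only; MATLAB's polyeig may differ in detail."""
    p = len(A) - 1
    n = A[0].shape[0]
    N = n * p
    Ca = np.zeros((N, N))
    Cb = np.eye(N)
    Ca[: N - n, n:] = np.eye(N - n)     # identity shift blocks
    Ca[N - n :, :] = -np.hstack(A[:p])  # last block row: -[A0 ... A(p-1)]
    Cb[N - n :, N - n :] = A[p]         # trailing diagonal block: Ap
    e, Z = eig(Ca, Cb)                  # generalized problem Ca z = lam * Cb z
    X = Z[:n, :]                        # eigenvector x = leading n entries of z
    return X / np.linalg.norm(X, axis=0), e

# Quadratic example: (A0 + lam*A1 + lam^2*A2) x = 0, with n = 2, p = 2
A0 = np.array([[2.0, 0.0], [0.0, 3.0]])
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
A2 = np.eye(2)
X, e = polyeig_sketch(A0, A1, A2)

# Each residual (A0 + lam*A1 + lam^2*A2) @ x should be near zero.
res = max(np.linalg.norm((A0 + lam * A1 + lam**2 * A2) @ X[:, j])
          for j, lam in enumerate(e))
```

As documented above, X has n*p = 4 columns and e has length 4 for this 2-by-2 quadratic problem.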
[X,e,s] = polyeig(A0,A1,...,Ap) also returns a vector s of length p*n containing condition numbers for the eigenvalues. At least one of A0 and Ap must be nonsingular. Large condition numbers imply that the problem is close to one with multiple eigenvalues.
Based on the values of p and n, polyeig handles several special cases:
p = 0, or polyeig(A) is the standard eigenvalue problem: eig(A).
p = 1, or polyeig(A,B) is the generalized eigenvalue problem: eig(A,-B).
n = 1, or polyeig(a0,a1,...,ap) for scalars a0, a1, ..., ap, is the standard polynomial root-finding problem: roots([ap ... a1 a0]).
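The p = 1 and n = 1 special cases can be sketched in NumPy/SciPy (the matrices and polynomial coefficients below are illustrative assumptions, not from the original text):

```python
import numpy as np
from scipy.linalg import eig

# p = 1 special case: polyeig(A, B) is the generalized problem eig(A, -B),
# i.e. (A + lam*B) x = 0 for each eigenpair (lam, x).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])
e, X = eig(A, -B)
res = max(np.linalg.norm((A + lam * B) @ X[:, j]) for j, lam in enumerate(e))

# n = 1 special case: scalars reduce to polynomial root finding.
# a0 + a1*lam + a2*lam^2 = 6 - 5*lam + lam^2, whose roots are 2 and 3.
a0, a1, a2 = 6.0, -5.0, 1.0
lams = np.sort(np.roots([a2, a1, a0]).real)  # roots wants highest degree first
```

Note the coefficient order: roots (in MATLAB and NumPy alike) expects the highest-degree coefficient first, hence [ap ... a1 a0].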
If both A0 and Ap are singular the problem is potentially ill-posed. Theoretically, the solutions might not exist or might not be unique. Computationally, the computed solutions might be inaccurate. If one, but not both, of A0 and Ap is singular, the problem is well posed, but some of the eigenvalues might be zero or infinite.
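The well-posed singular case can be illustrated with the p = 1 linear problem and SciPy's generalized eigensolver (illustrative matrices; scipy.linalg.eig reports an infinite eigenvalue as inf):

```python
import numpy as np
from scipy.linalg import eig

# Singular leading matrix A0: the problem (A0 + lam*A1) x = 0
# then has a zero eigenvalue (take x in the null space of A0).
A0 = np.diag([0.0, 1.0])              # singular A0
e_zero, _ = eig(A0, -np.eye(2))       # polyeig(A0, I) analogue

# Singular trailing matrix Ap (here A1): an eigenvalue escapes to infinity.
A1 = np.diag([1.0, 0.0])              # singular A1
e_inf, _ = eig(np.eye(2), -A1)        # polyeig(I, A1) analogue
```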
Note that scaling A0, A1, ..., Ap so that norm(Ai) is roughly equal to 1 can improve the accuracy of polyeig. In general, however, such scaling is not always possible. (See Tisseur, 2000, for more detail.)
Dedieu, Jean-Pierre, and Francoise Tisseur. "Perturbation theory for homogeneous polynomial eigenvalue problems." Linear Algebra Appl., Vol. 358, 2003, pp. 71-94.
Tisseur, Francoise, and Karl Meerbergen. "The quadratic eigenvalue problem." SIAM Rev., Vol. 43, No. 2, 2001, pp. 235-286.
Tisseur, Francoise. "Backward error and condition of polynomial eigenvalue problems." Linear Algebra Appl., Vol. 309, 2000, pp. 339-361.