
# polyeig

Polynomial eigenvalue problem

## Syntax

`e = polyeig(A0,A1,...,Ap)`
`[X,e] = polyeig(A0,A1,...,Ap)`
`[X,e,s] = polyeig(A0,A1,...,Ap)`

## Description

`e = polyeig(A0,A1,...,Ap)` returns the eigenvalues for the polynomial eigenvalue problem of degree `p`.

`[X,e] = polyeig(A0,A1,...,Ap)` also returns matrix `X`, of size `n`-by-`n*p`, whose columns are the eigenvectors.

`[X,e,s] = polyeig(A0,A1,...,Ap)` additionally returns vector `s`, of length `p*n`, containing condition numbers for the eigenvalues. At least one of `A0` and `Ap` must be nonsingular. Large condition numbers imply that the problem is close to a problem with repeated eigenvalues.

## Examples


Solve a quadratic eigenvalue problem involving a mass matrix `M`, damping matrix `C`, and stiffness matrix `K`. This quadratic eigenvalue problem arises from the equation of motion:

`$M\frac{{d}^{2}x}{d{t}^{2}}+C\frac{dx}{dt}+Kx=0\text{\hspace{0.17em}}.$`

This equation applies to a broad range of oscillating systems, including a dynamic mass-spring system or RLC electronic network. The fundamental solution is `$x\left(t\right)=x{e}^{\lambda t}$`, so both `$\lambda$` and `x` must solve the quadratic eigenvalue problem (QEP),

`$\left({\lambda }^{2}M+\lambda C+K\right)x=0\text{\hspace{0.17em}}.$`

Create coefficient matrices `M`, `C`, and `K` to represent a mass-spring system with four-degrees-of-freedom. The coefficient matrices are all symmetric and positive semidefinite, and `M` is a diagonal matrix.

`M = diag([3 1 3 1])`

```
M =
     3     0     0     0
     0     1     0     0
     0     0     3     0
     0     0     0     1
```

`C = [0.4 0 -0.3 0; 0 0 0 0; -0.3 0 0.5 -0.2; 0 0 -0.2 0.2]`

```
C =
    0.4000         0   -0.3000         0
         0         0         0         0
   -0.3000         0    0.5000   -0.2000
         0         0   -0.2000    0.2000
```

`K = [-7 2 4 0; 2 -4 2 0; 4 2 -9 3; 0 0 3 -3]`

```
K =
    -7     2     4     0
     2    -4     2     0
     4     2    -9     3
     0     0     3    -3
```

Solve the QEP for the eigenvalues, eigenvectors, and condition numbers using `polyeig`.

`[X,e,s] = polyeig(K,C,M)`
```
X =
  Columns 1 through 7

    0.1828    0.3421    0.3989    0.0621    0.3890   -0.4143   -0.4575
    0.3530   -0.9296    0.3330   -0.8571   -0.6366   -0.2717   -0.4981
   -0.5360   -0.0456   -0.1724    0.3509   -0.3423    0.1666   -0.5106
    0.7448    0.1295   -0.8368   -0.3720    0.5712    0.8525   -0.5309

  Column 8

    0.4563
    0.4985
    0.5107
    0.5315
```

```
e =
   -2.4498
   -2.1536
   -1.6248
    2.2279
    2.0364
    1.4752
    0.3353
   -0.3466
```

```
s =
    0.5813
    0.8609
    1.2232
    0.7855
    0.7012
    1.2922
   10.1097
   10.0519
```

Check that the first eigenvalue, `e(1)`, and first eigenvector, `X(:,1)`, satisfy the QEP equation. The result is close to, but not exactly, zero.

```
lambda = e(1);
x = X(:,1);
(M*lambda^2 + C*lambda + K)*x
```

```
ans =
   1.0e-13 *

   -0.1688
    0.0933
    0.0977
    0.0111
```

## Input Arguments


`A0,A1,...,Ap` — Square coefficient matrices, specified as separate arguments. The matrices must all have the same order, `n`.

Data Types: `single` | `double`
Complex Number Support: Yes

## Output Arguments


`e` — Eigenvalues, returned as a vector.

`X` — Eigenvectors, returned in the columns of a matrix. The first eigenvector is `X(:,1)`, the second is `X(:,2)`, and so on.

`s` — Condition numbers, returned as a vector. The condition numbers in `s` correspond to similarly located eigenvalues in `e`. Large condition numbers indicate that the problem is close to having repeated eigenvalues.

## More About

### Polynomial Eigenvalue Problem

The polynomial eigenvalue problem is a variant of the standard eigenvalue problem, Ax = λx, but instead involves polynomials rather than linear terms.

As with the standard eigenvalue problem, the solution involves finding the eigenvalues and eigenvectors that satisfy the equation,

`$\left({A}_{0}+\lambda {A}_{1}+\dots +{\lambda }^{p}{A}_{p}\right)x=0\text{\hspace{0.17em}},$`

where the polynomial degree, `p`, is a nonnegative integer, and `A0,A1,...,Ap` are square coefficient matrices of order `n`.

The most common form is the quadratic polynomial eigenvalue problem, which is

`$\left({A}_{2}{\lambda }^{2}+{A}_{1}\lambda +{A}_{0}\right)x=0\text{\hspace{0.17em}}.$`

One major difference between the quadratic eigenvalue problem and the standard (or generalized) eigenvalue problem is that there can be up to `2n` eigenvalues with up to `2n` right and left eigenvectors. In cases where there are more than `n` eigenvectors, the eigenvectors do not form a linearly independent set. See [1] and [2] for more detailed information about the quadratic eigenvalue problem.
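The reduction from a QEP to an ordinary eigenvalue problem of twice the size can be sketched numerically. The following Python/NumPy snippet is an illustrative sketch, not MATLAB's actual `polyeig` implementation: the helper name `quadratic_eig` is hypothetical, and it assumes the leading coefficient matrix is nonsingular so a companion (linearization) matrix can be formed directly.

```python
import numpy as np

def quadratic_eig(A2, A1, A0):
    """Solve (lam^2*A2 + lam*A1 + A0) x = 0 by companion linearization.

    Illustrative sketch only (hypothetical helper, not MATLAB's polyeig);
    assumes A2 is nonsingular, giving a standard 2n-by-2n eigenproblem.
    """
    n = A0.shape[0]
    # With z = [x; lam*x], the QEP becomes L z = lam z, where
    # L = [[0, I], [-inv(A2)@A0, -inv(A2)@A1]] is the companion matrix.
    L = np.block([
        [np.zeros((n, n)), np.eye(n)],
        [-np.linalg.solve(A2, A0), -np.linalg.solve(A2, A1)],
    ])
    e, Z = np.linalg.eig(L)
    X = Z[:n, :]  # top block of each companion eigenvector holds x
    return X, e

# Scalar sanity check: lam^2 - 3*lam + 2 = 0 has roots 1 and 2.
A2, A1, A0 = np.eye(1), np.array([[-3.0]]), np.array([[2.0]])
X, e = quadratic_eig(A2, A1, A0)
print(np.sort(e.real))  # -> [1. 2.]
```

For each computed eigenpair, the residual `(lam^2*A2 + lam*A1 + A0) @ x` should be near zero, which mirrors the residual check shown in the Examples section above.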

## Tips

• `polyeig` handles the following simplified cases:

• `p = 0`, or `polyeig(A)`, is the standard eigenvalue problem, `eig(A)`.

• `p = 1`, or `polyeig(A,B)`, is the generalized eigenvalue problem, `eig(A,-B)`.

• `n = 1`, or `polyeig(a0,a1,...,ap)`, is the standard polynomial problem, `roots([ap ... a1 a0])`, where `a0,a1,...,ap` are scalars.
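The `p = 1` and scalar reductions above can be checked numerically. The following Python/NumPy sketch (variable names are illustrative; `np.roots` plays the role of MATLAB's `roots`, and the `p = 1` case assumes `A1` is invertible so the generalized problem reduces to a standard one):

```python
import numpy as np

rng = np.random.default_rng(0)

# p = 1: (A0 + lam*A1) x = 0 is the generalized problem eig(A0, -A1).
# With A1 invertible, lam are the eigenvalues of -inv(A1) @ A0.
A0 = rng.standard_normal((3, 3))
A1 = rng.standard_normal((3, 3))
e_linear = np.linalg.eigvals(-np.linalg.solve(A1, A0))

# n = 1: scalar coefficients reduce to ordinary polynomial root finding,
# the analogue of roots([ap ... a1 a0]).
a0, a1, a2 = 2.0, -3.0, 1.0  # lam^2 - 3*lam + 2
e_scalar = np.roots([a2, a1, a0])
print(np.sort(e_scalar))  # -> [1. 2.]
```

Each eigenvalue from the `p = 1` case makes `A0 + lam*A1` singular, which is exactly the generalized eigenvalue condition.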

## Algorithms

The `polyeig` function uses the QZ factorization to find intermediate results in the computation of generalized eigenvalues. `polyeig` uses the intermediate results to determine if the eigenvalues are well-determined. See the descriptions of `eig` and `qz` for more information.

The computed solutions might not exist or be unique, and can also be computationally inaccurate. If both `A0` and `Ap` are singular matrices, then the problem might be ill-posed. If only one of `A0` and `Ap` is singular, then some of the eigenvalues might be `0` or `Inf`.

Scaling `A0,A1,...,Ap` so that each `norm(Ai)` is roughly equal to `1` might increase the accuracy of `polyeig`. In general, however, this improved accuracy is not always achievable. (See Tisseur [3] for details.)

## References

[1] Dedieu, Jean-Pierre, and Francoise Tisseur. “Perturbation theory for homogeneous polynomial eigenvalue problems.” Linear Algebra Appl. Vol. 358, 2003, pp. 71–94.

[2] Tisseur, Francoise, and Karl Meerbergen. “The quadratic eigenvalue problem.” SIAM Rev. Vol. 43, Number 2, 2001, pp. 235–286.

[3] Tisseur, Francoise. “Backward error and condition of polynomial eigenvalue problems.” Linear Algebra Appl. Vol. 309, 2000, pp. 339–361.