Definition: A scalar λ is called an eigenvalue of the n × n matrix A if there is a nonzero vector x such that Ax = λx; such an x is called an eigenvector of A corresponding to λ. Similar matrices have at least one useful property, as seen in the following. (d) Prove that similar matrices have the same eigenvalues. What is the relation between their eigenvectors? (e) Which pairs of the following matrices are similar?
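A quick numerical sketch of parts (d) and (e), using NumPy (not part of the original notes; the matrices A and S are chosen purely for illustration): if B = S⁻¹AS, then B has the same eigenvalues as A, and an eigenvector x of B yields the eigenvector Sx of A, since A(Sx) = SBx = λ(Sx).

```python
import numpy as np

# Sample matrix A and invertible change-of-basis matrix S
# (both chosen here purely for illustration).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# B = S^{-1} A S is similar to A.
B = np.linalg.inv(S) @ A @ S

# Similar matrices have the same eigenvalues...
evals_A = np.sort(np.linalg.eigvals(A))
evals_B = np.sort(np.linalg.eigvals(B))
print(np.allclose(evals_A, evals_B))  # True

# ...and if x is an eigenvector of B for λ, then S x is an
# eigenvector of A for the same λ: A(Sx) = SBx = λ(Sx).
lam, X = np.linalg.eig(B)
x = X[:, 0]
print(np.allclose(A @ (S @ x), lam[0] * (S @ x)))  # True
```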
Sort of alike, but not quite equal. The notion of two matrices being row-equivalent is an example of an equivalence relation, something we have been working with since the beginning of the course (see Exercise RREF). Row-equivalent matrices are not equal, but they are a lot alike.
For example, row-equivalent matrices have the same rank. Formally, an equivalence relation requires that three conditions hold: it must be reflexive, symmetric, and transitive. We will illustrate these as we prove that similarity is an equivalence relation.
Here is another theorem that tells us exactly what sorts of properties similar matrices share. So similar matrices not only have the same set of eigenvalues, the algebraic multiplicities of these eigenvalues are also the same. However, be careful with this theorem.
It is tempting to think the converse is true, and argue that if two matrices have the same eigenvalues, then they are similar.
Not so, as the following example illustrates. For example, the eigenvalues of a diagonal matrix are simply the entries on its diagonal. With an equivalence about singular matrices we can update our list of equivalences about nonsingular matrices. The following are equivalent.
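The failure of the converse can be sketched numerically (a standard counterexample, not taken from the original notes): the identity matrix and the matrix with an extra 1 above the diagonal both have eigenvalue 1 with algebraic multiplicity 2, yet they cannot be similar, because S⁻¹IS = I for every invertible S.

```python
import numpy as np

I2 = np.eye(2)
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Both matrices have eigenvalue 1 with algebraic multiplicity 2.
print(np.linalg.eigvals(I2))  # [1. 1.]
print(np.linalg.eigvals(J))   # [1. 1.]

# But they are not similar: conjugating the identity by any
# invertible S gives back the identity, never J.
S = np.array([[2.0, 1.0],
              [1.0, 1.0]])
print(np.allclose(np.linalg.inv(S) @ I2 @ S, I2))  # True
print(np.allclose(I2, J))                          # False
```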
Unfortunately, there are no parallel theorems about the sum or product of arbitrary matrices. But we can prove a similar result for powers of a matrix. While we cannot prove that the sum of two arbitrary matrices behaves in any reasonable way with regard to eigenvalues, we can work with sums of scalar multiples of powers of the same matrix, in other words with polynomials in the matrix.
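A numerical sketch of this result (the polynomial q(t) = t² + 3t + 2 and the matrix A are arbitrary choices for illustration): if λ is an eigenvalue of A with eigenvector x, then q(A)x = q(λ)x, so x is also an eigenvector of q(A), with eigenvalue q(λ).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric; eigenvalues 1 and 3

# q(A) = A^2 + 3A + 2I, a sum of scalar multiples of powers of A.
qA = A @ A + 3 * A + 2 * np.eye(2)

lam, X = np.linalg.eig(A)
for j in range(2):
    x = X[:, j]
    q_lam = lam[j] ** 2 + 3 * lam[j] + 2
    # q(A) x = q(λ) x: same eigenvector, eigenvalue q(λ).
    print(np.allclose(qA @ x, q_lam * x))  # True
```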
We have already seen two connections between eigenvalues and polynomials, in the proof of Theorem EMHE and the characteristic polynomial Definition CP. Our next theorem strengthens this connection.
Example BDE: Building desired eigenvalues. Inverses and transposes also behave predictably with regard to their eigenvalues. The proofs of the theorems above have a similar style to them: they all begin by grabbing an eigenvalue-eigenvector pair and adjusting it in some way to reach the desired conclusion. Above, it was shown that the eigenvectors of A form a basis. Say that P is the matrix whose columns are these n independent eigenvectors of A, i.e. P = (v1 v2 … vn).
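The facts about inverses and transposes can be checked numerically (the matrix below is a made-up example): the transpose Aᵀ has exactly the same eigenvalues as A, and A⁻¹ has eigenvalues 1/λ for each eigenvalue λ of an invertible A.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # invertible; eigenvalues 5 and 2

lam = np.sort(np.linalg.eigvals(A))

# The transpose has exactly the same eigenvalues as A.
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), lam))  # True

# The inverse has eigenvalues 1/λ for each eigenvalue λ of A.
lam_inv = np.sort(np.linalg.eigvals(np.linalg.inv(A)))
print(np.allclose(lam_inv, np.sort(1 / lam)))  # True
```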
Finding eigenvectors of similar matrices
Since all the vectors vi are linearly independent, P is non-singular and P⁻¹AP = D, where D is the diagonal matrix with the corresponding eigenvalues on its diagonal. Hence A is diagonalisable. On the other hand, if A is diagonalisable then it has n linearly independent eigenvectors.
Indeed, D clearly has n independent eigenvectors, the standard basis vectors e1, e2, …, en. Previously we have shown this implies that Pej is an eigenvector of A.
Since P is invertible, all the vectors Pej are linearly independent.
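This diagonalisation argument can be checked with NumPy, whose eig routine returns precisely such a matrix P whose columns are eigenvectors (the matrix A below is a made-up symmetric example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix P whose
# columns are the corresponding eigenvectors.
lam, P = np.linalg.eig(A)

# With n independent eigenvectors, P is invertible and P^{-1} A P = D.
D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(lam)))  # True

# Each column P e_j is an eigenvector of A: A (P e_j) = λ_j (P e_j).
for j in range(2):
    print(np.allclose(A @ P[:, j], lam[j] * P[:, j]))  # True
```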
Note that if all eigenvalues are different then the corresponding eigenvectors are linearly independent. Hence in this case it is easy to see that the matrix is diagonalisable.
Here is an example of a diagonalisable matrix. Here is an example of a non-diagonalisable matrix.
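A minimal numerical illustration of both cases (matrices chosen for illustration; the second is the standard Jordan-block example): a matrix with distinct eigenvalues has a full set of independent eigenvectors, while the Jordan block has eigenvalue 1 with algebraic multiplicity 2 but only a one-dimensional eigenspace.

```python
import numpy as np

# Diagonalisable: distinct eigenvalues (1 and 3) guarantee two
# independent eigenvectors, so the eigenvector matrix P has full rank.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
lam, P = np.linalg.eig(A)
print(np.linalg.matrix_rank(P))  # 2: the eigenvectors span R^2

# Not diagonalisable: the Jordan block J has eigenvalue 1 with
# algebraic multiplicity 2, but its eigenspace ker(J - I) is only
# one-dimensional, so no basis of eigenvectors exists.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
eigenspace_dim = 2 - np.linalg.matrix_rank(J - np.eye(2))
print(eigenspace_dim)  # 1
```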