Eigenvalues and Eigenvectors — An Introduction
Understand eigenvalues and eigenvectors at a conceptual level. Learn the characteristic equation, how to compute eigenvalues for 2x2 matrices, and real-world applications.
Detailed Explanation
Eigenvalues and Eigenvectors
An eigenvector of a square matrix A is a non-zero vector v such that multiplying A by v simply scales v by a factor lambda (the eigenvalue):
A * v = lambda * v
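This defining equation can be checked numerically. A minimal sketch with NumPy (the matrix below is just an illustrative choice for this sketch):

```python
import numpy as np

# An illustrative 2x2 matrix (any square matrix works)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each eigenpair satisfies A * v = lambda * v
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues))
```

Note that `np.linalg.eig` returns unit-length eigenvectors, but any nonzero scalar multiple of an eigenvector is also an eigenvector.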
Finding Eigenvalues
Rearranging: (A - lambda * I) * v = 0. For a non-trivial solution, the matrix (A - lambda * I) must be singular:
det(A - lambda * I) = 0
This is the characteristic equation. Its roots are the eigenvalues.
2x2 Example
A = | 4  1 |
    | 2  3 |
det(A - lambda*I) = det | 4-lambda     1      |
                        |    2      3-lambda  |
= (4-lambda)(3-lambda) - 2
= lambda^2 - 7*lambda + 10
= (lambda - 5)(lambda - 2)
Eigenvalues: lambda = 5 and lambda = 2
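The same computation can be sketched in NumPy: for any 2x2 matrix the characteristic polynomial is lambda^2 - trace(A)*lambda + det(A), so its roots are the eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix:
#   lambda^2 - trace(A)*lambda + det(A)
# For this A the coefficients are [1, -7, 10].
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

roots = np.roots(coeffs)  # roots of the characteristic polynomial
print(sorted(roots))      # the eigenvalues, 2 and 5
```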
Properties
- The sum of eigenvalues equals the trace (sum of diagonal elements)
- The product of eigenvalues equals the determinant
- Real symmetric matrices have real eigenvalues
- If A is invertible, the eigenvalues of A^(-1) are 1/lambda (with the same eigenvectors)
- Eigenvalues of A^n are lambda^n
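The properties above can be verified numerically for the 2x2 example; a quick sketch using NumPy:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigs = np.linalg.eigvals(A)  # eigenvalues 5 and 2 (order not guaranteed)

# Sum of eigenvalues = trace = 4 + 3 = 7
assert np.isclose(eigs.sum(), np.trace(A))

# Product of eigenvalues = determinant = 4*3 - 1*2 = 10
assert np.isclose(eigs.prod(), np.linalg.det(A))

# Eigenvalues of A^(-1) are 1/lambda
inv_eigs = np.linalg.eigvals(np.linalg.inv(A))
assert np.allclose(sorted(inv_eigs), sorted(1.0 / eigs))

# Eigenvalues of A^n are lambda^n (here n = 3)
pow_eigs = np.linalg.eigvals(np.linalg.matrix_power(A, 3))
assert np.allclose(sorted(pow_eigs), sorted(eigs ** 3))

print("all properties hold")
```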
Geometric Interpretation
An eigenvector points along a line that the transformation maps onto itself: the vector is only stretched or shrunk (and flipped if the eigenvalue is negative), never rotated off that line. The eigenvalue is the scaling factor along that direction.
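For the 2x2 example above, v = [1, 1] is an eigenvector for lambda = 5 (it solves (A - 5I)v = 0), so A stretches it by 5 without rotating it; a vector that is not an eigenvector comes out pointing in a different direction:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

v = np.array([1.0, 1.0])  # eigenvector for lambda = 5
w = np.array([1.0, 0.0])  # not an eigenvector

print(A @ v)  # [5. 5.] -> same direction as v, scaled by 5
print(A @ w)  # [4. 2.] -> no longer parallel to w
```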
Use Case
Eigenvalues and eigenvectors are used in Principal Component Analysis (PCA) for dimensionality reduction, Google's PageRank algorithm (the dominant eigenvector of the link matrix), vibration analysis in mechanical engineering, quantum mechanics (energy states are eigenvalues of the Hamiltonian), and stability analysis of dynamical systems.