Symmetric and Orthogonal Matrices
Learn about symmetric matrices (A = A^T) and orthogonal matrices (Q^T*Q = I). Understand their special properties and importance in linear algebra applications.
Detailed Explanation
Symmetric Matrices
A matrix A is symmetric if A = A^T (it equals its own transpose). This means a[i,j] = a[j,i] for all entries.
Example:
| 1 2 3 |
| 2 5 6 |
| 3 6 9 |
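The condition a[i,j] = a[j,i] is straightforward to verify numerically. A minimal sketch (using NumPy, a tooling choice not specified by the source):

```python
import numpy as np

# The example matrix above: entries mirror across the main diagonal
A = np.array([[1, 2, 3],
              [2, 5, 6],
              [3, 6, 9]])

# A matrix is symmetric exactly when it equals its own transpose
print(np.array_equal(A, A.T))  # True
```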
Properties of Symmetric Matrices
- All eigenvalues are real (this holds for real symmetric matrices; the complex analogue with real eigenvalues is a Hermitian matrix, A = A^H)
- Eigenvectors corresponding to distinct eigenvalues are orthogonal
- Can be diagonalized: A = Q * D * Q^T where Q is orthogonal and D is diagonal
- The Spectral Theorem guarantees this decomposition always exists
- A symmetric matrix is positive definite if and only if all of its eigenvalues are > 0
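The diagonalization A = Q * D * Q^T can be computed and verified directly. A sketch using NumPy's `eigh`, which is specialized for symmetric/Hermitian matrices (the library choice is an assumption):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 6.0],
              [3.0, 6.0, 9.0]])

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors
eigenvalues, Q = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Reconstruct A = Q * D * Q^T, the decomposition guaranteed by the Spectral Theorem
print(np.allclose(A, Q @ D @ Q.T))       # True
# Q is orthogonal: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
```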
Orthogonal Matrices
A square matrix Q is orthogonal if Q^T * Q = Q * Q^T = I. This means:
- Q^(-1) = Q^T (the inverse is simply the transpose)
- det(Q) = +/-1
- Columns are orthonormal vectors
- Rows are orthonormal vectors
Example: a rotation by 45 degrees (cos 45° = sin 45° ≈ 0.707):
R(45°) = | 0.707 -0.707 |
         | 0.707  0.707 |
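The rotation matrix above can be built from cos and sin and checked against the defining properties. A sketch (NumPy is an assumed tool):

```python
import numpy as np

theta = np.pi / 4  # 45 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Both Q^T Q and Q Q^T equal the identity
print(np.allclose(R.T @ R, np.eye(2)))      # True
print(np.allclose(R @ R.T, np.eye(2)))      # True
# The inverse is just the transpose
print(np.allclose(np.linalg.inv(R), R.T))   # True
# det = +1 for a rotation (it would be -1 for a reflection)
print(np.isclose(np.linalg.det(R), 1.0))    # True
```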
Properties of Orthogonal Matrices
- Preserve lengths: ||Qx|| = ||x|| for all x
- Preserve angles: the angle between Qu and Qv equals the angle between u and v
- Preserve dot products: (Qu)^T(Qv) = u^T * v
- Form a group: the product of orthogonal matrices is orthogonal
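The preservation and group properties above can be demonstrated on random vectors. A sketch under the assumption of a 2x2 rotation as the orthogonal matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = rng.standard_normal(2)
v = rng.standard_normal(2)

# Lengths are preserved: ||Qx|| = ||x||
print(np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u)))  # True
# Dot products (and hence angles) are preserved: (Qu)^T(Qv) = u^T v
print(np.isclose((Q @ u) @ (Q @ v), u @ v))                  # True
# Products of orthogonal matrices are orthogonal (group property)
P = Q @ Q
print(np.allclose(P.T @ P, np.eye(2)))                       # True
```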
Connection
Symmetric matrices are diagonalized by orthogonal matrices. This relationship is the foundation of the Spectral Theorem and many decomposition algorithms (SVD, eigendecomposition).
Use Case
Symmetric matrices appear as covariance matrices in statistics, Hessian matrices in optimization, and adjacency matrices of undirected graphs. Orthogonal matrices represent rotations and reflections, are used in QR decomposition for solving least squares problems, and are central to the Gram-Schmidt process. Both types are fundamental building blocks in numerical linear algebra.
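The least squares use case can be sketched concretely: QR decomposition reduces minimizing ||Ax - y|| to a triangular solve. A minimal example with made-up data points (the fitted line and values are illustrative, not from the source):

```python
import numpy as np

# Overdetermined system: fit a line y = c0 + c1*x to four noisy points
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(x), x])  # design matrix

# QR decomposition: A = Q R, with orthonormal columns in Q and upper-triangular R
Q, R = np.linalg.qr(A)
# Because Q preserves lengths, minimizing ||Ax - y|| reduces to solving R c = Q^T y
coeffs = np.linalg.solve(R, Q.T @ y)

# Agrees with NumPy's built-in least squares solver
reference, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(coeffs, reference))  # True
```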