🔹 1. Introduction
Eigenvalues and eigenvectors play a central role in linear algebra, with wide applications in physics, engineering, and data science. They identify the directions in which a linear transformation acts by pure scaling, which makes the transformation's behaviour on a vector space much easier to analyse.
🔹 2. Basic Definitions
Let \(A\) be an \(n \times n\) square matrix. A non-zero vector \(\mathbf{v} \in \mathbb{R}^n\) is called an eigenvector of \(A\) if it satisfies:
\[A \mathbf{v} = \lambda \mathbf{v}\]
Here:
- \(\lambda \in \mathbb{R}\) (or \(\mathbb{C}\)) is the eigenvalue corresponding to eigenvector \(\mathbf{v}\).
- \(\mathbf{v} \ne \mathbf{0}\) is a direction preserved under the transformation by \(A\), scaled by \(\lambda\).
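To make the definition concrete, here is a minimal sketch, assuming Python with NumPy (neither is part of the original text), that computes eigenpairs of a small matrix and numerically checks \(A \mathbf{v} = \lambda \mathbf{v}\):

```python
import numpy as np

# An arbitrary 2x2 matrix, chosen only for illustration (not from the text).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check the defining relation A v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))   # both checks print True (eigenvalues are 5 and 2)
```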
🔹 3. How to Find Eigenvalues and Eigenvectors
Step 1: Characteristic Equation
To find eigenvalues, solve the characteristic equation:
\[\det(A - \lambda I) = 0\]
- \(I\) is the identity matrix of the same size as \(A\).
- The determinant is a degree-\(n\) polynomial in \(\lambda\) called the characteristic polynomial; its roots are the eigenvalues of \(A\).
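As a sketch of this step, the characteristic polynomial can be formed and solved symbolically. The snippet below assumes SymPy (not mentioned in the text) and uses the same matrix that appears in the worked example of Section 4:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1], [1, 2]])            # matrix from the worked example in Section 4

# Characteristic polynomial det(A - lambda*I).
char_poly = (A - lam * sp.eye(2)).det()
print(sp.expand(char_poly))                 # lambda**2 - 4*lambda + 3
print(sp.solve(sp.Eq(char_poly, 0), lam))   # [1, 3]
```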
Step 2: Solve for Eigenvectors
For each eigenvalue \(\lambda\), solve the system:
\[(A - \lambda I) \mathbf{v} = \mathbf{0}\]
to find the corresponding eigenvector(s) \(\mathbf{v}\).
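The eigenvectors for a given \(\lambda\) are exactly the non-zero vectors in the null space of \(A - \lambda I\). A short symbolic sketch of this step, again assuming SymPy and the matrix of Section 4:

```python
import sympy as sp

A = sp.Matrix([[2, 1], [1, 2]])

# For each eigenvalue, the eigenvectors span the null space of (A - lambda*I).
for lam in (1, 3):
    basis = (A - lam * sp.eye(2)).nullspace()
    print(lam, basis)
# lambda = 1 gives a vector proportional to (1, -1); lambda = 3 gives (1, 1).
```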
🔸 4. Example
Let
\[A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}\]
Step 1: Find Eigenvalues
Solve:
\[\det(A - \lambda I) = \det \begin{bmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{bmatrix} = (2 - \lambda)^2 - 1 = 0\]
So,
\[(2 - \lambda)^2 = 1 \Rightarrow \lambda = 1, 3\]
Step 2: Find Eigenvectors
For \(\lambda = 1\):
\[(A - I) \mathbf{v} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \mathbf{0} \Rightarrow x + y = 0 \Rightarrow \mathbf{v}_1 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}\]
For \(\lambda = 3\):
\[(A - 3I) \mathbf{v} = \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \mathbf{0} \Rightarrow x - y = 0 \Rightarrow \mathbf{v}_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}\]
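The hand computation can be cross-checked with a quick numerical sketch, assuming NumPy: since \(A\) is symmetric, `np.linalg.eigh` applies and returns the eigenvalues in ascending order.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(A)   # eigh is intended for symmetric/Hermitian matrices
print(vals)                      # [1. 3.]
print(vecs)                      # columns proportional to (1, -1) and (1, 1), up to sign and scaling
```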
🔹 5. Key Properties
- An \(n \times n\) matrix has exactly \(n\) eigenvalues over \(\mathbb{C}\), counted with algebraic multiplicity, and hence at most \(n\) distinct eigenvalues.
- Eigenvectors corresponding to distinct eigenvalues are linearly independent.
- If a matrix is real and symmetric, all its eigenvalues are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal; in fact, an orthonormal basis of eigenvectors exists.
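A small numerical sketch of the symmetric case, assuming NumPy (the random matrix is purely illustrative): the eigenvalues come out real and the eigenvector matrix is orthogonal.

```python
import numpy as np

rng = np.random.default_rng(42)
B = rng.standard_normal((4, 4))
S = B + B.T                              # symmetric by construction

vals, vecs = np.linalg.eigh(S)           # eigendecomposition of a symmetric matrix
print(vals)                              # four real eigenvalues
print(np.allclose(vecs.T @ vecs, np.eye(4)))   # True: the eigenvectors are orthonormal
```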
🔹 6. Physical Interpretation
In physics:
- In quantum mechanics, the eigenvalues of a Hermitian operator are the possible measured values of the corresponding observable.
- In classical mechanics, the normal modes of an oscillating system are eigenvectors of the system matrix, and the associated eigenvalues determine the natural frequencies.
📌 Summary
| Term | Meaning |
|---|---|
| Eigenvalue | Scalar \(\lambda\) such that \(A \mathbf{v} = \lambda \mathbf{v}\) |
| Eigenvector | Non-zero vector \(\mathbf{v}\) preserved in direction by \(A\) |
| Characteristic Equation | \(\det(A - \lambda I) = 0\), solved to find the eigenvalues |
| Matrix Diagonalization | Possible if the matrix has \(n\) linearly independent eigenvectors |
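As a sketch of the last row, assuming NumPy and reusing the example matrix: with \(P\) holding the eigenvectors as columns and \(D\) the diagonal matrix of eigenvalues, diagonalization means \(A = P D P^{-1}\).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, P = np.linalg.eig(A)       # columns of P are eigenvectors of A
D = np.diag(vals)

# A is diagonalizable here because its 2 eigenvectors are linearly independent.
print(np.linalg.matrix_rank(P) == A.shape[0])    # True
print(np.allclose(P @ D @ np.linalg.inv(P), A))  # True: A = P D P^{-1}
```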