Eigenvectors and eigenvalues



Let $\bc{\A}$ be a square matrix. Multiplying a vector $\rc{\v}$ by $\bc{\A}$ results in another vector. If this vector points in the same direction as $\rc{\v}$, then we call $\rc{\v}$ an eigenvector of $\bc{\A}$.

As a simple visualization of eigenvectors, imagine yourself standing up and turning about your vertical axis. While you do so, point your right arm straight up, and your left arm forward. The rotation can be described by a $3 \times 3$ matrix. Your right arm doesn’t change direction as you rotate, so that’s an eigenvector of the rotation. Your left arm does change direction, so that’s not an eigenvector.
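We can check this picture numerically. Below is a small sketch (using NumPy, with an assumed rotation angle of 45 degrees) of a rotation about the vertical axis: the "up" vector comes back unchanged, while the "forward" vector does not.

```python
import numpy as np

# Rotation by 45 degrees about the vertical (z) axis.
theta = np.pi / 4
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

up      = np.array([0.0, 0.0, 1.0])  # right arm: straight up
forward = np.array([1.0, 0.0, 0.0])  # left arm: pointing forward

print(A @ up)       # same as up: an eigenvector (with eigenvalue 1)
print(A @ forward)  # rotated away from forward: not an eigenvector
```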


We require that $\rc{\v}$ be a nonzero vector. For $\rc{\v}$ to be an eigenvector of $\bc{\A}$, its length can change when we multiply it by $\bc{\A}$, but its direction can’t. Another way of saying this is that multiplying $\bc{\A}$ by $\rc{\v}$ is the same as scaling the vector $\rc{\v}$ by some scalar, say $\bc{\lambda}$. This means we can define eigenvectors as follows. The vector $\rc{\v}$ is an eigenvector of $\bc{\A}$ if (and only if):

\[\bc{\A}\rc{\v} = \bc{\lambda}\rc{\v}\]

where $\bc{\lambda}$ is some scalar. If $\rc{\v}$ is an eigenvector, then the corresponding scalar $\bc{\lambda}$ is called its eigenvalue. We can also say that $\bc{\lambda}$ is an eigenvalue of $\bc{\A}$.
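The definition is easy to check in code. The sketch below (using NumPy, with an arbitrary symmetric matrix chosen for illustration) computes the eigenvectors and eigenvalues of a matrix and verifies that each pair satisfies $\bc{\A}\rc{\v} = \bc{\lambda}\rc{\v}$.

```python
import numpy as np

# An arbitrary example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix
# whose columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    # Multiplying by A is the same as scaling by lambda.
    assert np.allclose(A @ v, lam * v)
```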

Note the following details about this definition:

- The vector $\rc{\v}$ must be nonzero, but the eigenvalue $\bc{\lambda}$ may be zero.
- The eigenvalue $\bc{\lambda}$ may also be negative. In that case, multiplying by $\bc{\A}$ flips $\rc{\v}$ to point the opposite way, but the result still lies along the same line through the origin.
- Eigenvectors are not unique: if $\rc{\v}$ is an eigenvector of $\bc{\A}$ with eigenvalue $\bc{\lambda}$, then so is any nonzero scalar multiple of $\rc{\v}$, with the same eigenvalue.

Geometric intuition

From the definition, it’s not easy to see why eigenvectors and eigenvalues are such an important concept. To understand that properly, we’ll need to explore some places where the concept pops up.

Maximizing the quadratic

Eigenvalues of the covariance

Moving quantities around

In abstract linear algebra