Welcome to Further Matrix Algebra!
Hi there! If you've made it to Further Pure 3, you're already a maths powerhouse. This chapter, Further Matrix Algebra, might seem abstract, but it's one of the most powerful tools in applied mathematics.
Don't worry if matrices felt tricky before. We are focusing on two main, fascinating ideas: how 3x3 matrices describe transformations in 3D space, and how we find special "stable directions" (Eigenvectors) that simplify everything.
Think of matrices as sophisticated instruction manuals for geometry. Let's dive in and unlock their secrets!
Section 1: Revisiting 3D Transformations
In FP1 and FP2, you mastered 2x2 matrices for 2D transformations (reflections, rotations, stretches). Now, we expand this into three dimensions using 3x3 matrices.
1.1 Understanding the 3x3 Transformation Matrix
A 3x3 matrix, \(\mathbf{A}\), transforms a 3D position vector \(\mathbf{r} = \begin{pmatrix} x \\ y \\ z \end{pmatrix}\) into a new position vector \(\mathbf{r}' = \mathbf{A}\mathbf{r}\).
The structure of the matrix is determined by where the three basis vectors are mapped:
- Column 1: Where \(\mathbf{i} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}\) maps to.
- Column 2: Where \(\mathbf{j} = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}\) maps to.
- Column 3: Where \(\mathbf{k} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}\) maps to.
Key Takeaway: To find the matrix for a transformation, just track where the unit axis vectors (\(\mathbf{i}\), \(\mathbf{j}\), \(\mathbf{k}\)) end up. Those image vectors form the columns of the transformation matrix.
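The column rule is easy to verify numerically. Here is a minimal NumPy sketch (NumPy is not part of the course; it is used purely as a checking tool), taking a 90-degree rotation about the z-axis as the example transformation:

```python
import numpy as np

# A 90-degree rotation about the z-axis sends:
#   i -> (0, 1, 0),   j -> (-1, 0, 0),   k -> (0, 0, 1).
image_of_i = np.array([0.0, 1.0, 0.0])
image_of_j = np.array([-1.0, 0.0, 0.0])
image_of_k = np.array([0.0, 0.0, 1.0])

# Stack the images as COLUMNS to build the transformation matrix.
A = np.column_stack([image_of_i, image_of_j, image_of_k])

# Applying A to each basis vector recovers its image.
assert np.allclose(A @ [1.0, 0.0, 0.0], image_of_i)
assert np.allclose(A @ [0.0, 1.0, 0.0], image_of_j)
assert np.allclose(A @ [0.0, 0.0, 1.0], image_of_k)
```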
1.2 Standard 3D Transformation Matrices
You must be familiar with the standard transformation matrices. Here are three essential examples:
Rotation about an Axis
When rotating in 3D, the axis of rotation remains fixed.
Example: Rotation about the z-axis by angle \(\theta\).
The z-axis is fixed, so \(\mathbf{k}\) maps to itself: \(\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}\). The top-left 2x2 block handles the standard 2D rotation in the xy-plane.
\[ \mathbf{R}_z = \begin{pmatrix} \cos \theta & -\sin \theta & 0 \\ \sin \theta & \cos \theta & 0 \\ 0 & 0 & 1 \end{pmatrix} \]
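A quick numerical check of \(\mathbf{R}_z\) (a NumPy sketch, outside the syllabus): with \(\theta = 90°\), \(\mathbf{i}\) should rotate onto \(\mathbf{j}\) while \(\mathbf{k}\) stays fixed.

```python
import numpy as np

theta = np.pi / 2   # 90 degrees

R_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])

# i = (1,0,0) rotates onto j = (0,1,0); k is unchanged.
assert np.allclose(R_z @ [1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
assert np.allclose(R_z @ [0.0, 0.0, 1.0], [0.0, 0.0, 1.0])
```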
Reflection in a Plane
A reflection matrix flips the component perpendicular to the plane and leaves components within the plane unchanged.
Example: Reflection in the \(xy\)-plane (where \(z=0\)).
\(\mathbf{i}\) and \(\mathbf{j}\) are in the plane, so they are fixed. \(\mathbf{k}\) maps to \(-\mathbf{k}\).
\[ \mathbf{M}_{xy} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{pmatrix} \]
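Again as a quick NumPy check (not part of the course): reflecting a point in the \(xy\)-plane should negate only its z-component, and reflecting twice should return it home.

```python
import numpy as np

M_xy = np.diag([1.0, 1.0, -1.0])   # reflection in the xy-plane

# Only the z-component changes sign.
assert np.allclose(M_xy @ [1.0, 2.0, 3.0], [1.0, 2.0, -3.0])

# Reflecting twice is the identity transformation.
assert np.allclose(M_xy @ M_xy, np.eye(3))
```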
Enlargement (Scaling)
Enlargement by scale factor \(k\), centre the origin:
\[ \mathbf{E} = \begin{pmatrix} k & 0 & 0 \\ 0 & k & 0 \\ 0 & 0 & k \end{pmatrix} \]
Quick Review: Geometric Interpretation
The determinant of a transformation matrix, \(\det(\mathbf{A})\), gives the scale factor for volume under the transformation: a region of volume \(V\) maps to a region of volume \(|\det(\mathbf{A})| \, V\), and the sign records whether orientation is preserved.
- If \(\det(\mathbf{A}) = 1\), the volume is unchanged (e.g., rotation).
- If \(\det(\mathbf{A}) = -1\), the volume is unchanged, but the orientation is reversed (e.g., reflection).
- For an enlargement by scale factor \(k\), \(\det(\mathbf{A}) = k^3\), so the volume is scaled by \(k^3\).
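These three determinant facts can be confirmed numerically (a NumPy sketch; the angle and scale factor below are arbitrary choices):

```python
import numpy as np

theta = 0.7   # arbitrary rotation angle
k = 2.0       # arbitrary enlargement factor

R_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
M_xy = np.diag([1.0, 1.0, -1.0])   # reflection in the xy-plane
E = k * np.eye(3)                  # enlargement by factor k

assert np.isclose(np.linalg.det(R_z), 1.0)    # rotation preserves volume
assert np.isclose(np.linalg.det(M_xy), -1.0)  # reflection reverses orientation
assert np.isclose(np.linalg.det(E), k**3)     # volume scaled by k^3
```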
Section 2: Eigenvalues and Eigenvectors
This is the core concept of FP3 matrix algebra. It deals with finding special vectors that maintain their direction after a transformation, only being scaled by a factor.
Analogy: Imagine stretching a sheet of rubber (the transformation). Most points move and change direction, but eigenvectors point along the special lines drawn on the rubber that only get longer or shorter, staying on their original line.
2.1 Defining Eigenvalues and Eigenvectors
The relationship is defined by the equation:
\[ \mathbf{A}\mathbf{v} = \lambda\mathbf{v} \]
- \(\mathbf{A}\) is the matrix (usually 2x2 or 3x3).
- \(\mathbf{v}\) is the Eigenvector (the special direction).
- \(\lambda\) (lambda) is the Eigenvalue (the scaling factor).
Note: Eigenvectors cannot be the zero vector (\(\mathbf{v} \ne \mathbf{0}\)).
2.2 Step-by-Step: Finding Eigenvalues (\(\lambda\))
To find the scaling factors (\(\lambda\)), we rearrange the key equation:
\(\mathbf{A}\mathbf{v} = \lambda\mathbf{v}\)
\(\mathbf{A}\mathbf{v} - \lambda\mathbf{v} = \mathbf{0}\)
We must introduce the Identity Matrix, \(\mathbf{I}\), to factor out \(\mathbf{v}\) correctly:
\[ (\mathbf{A} - \lambda\mathbf{I})\mathbf{v} = \mathbf{0} \]
For non-trivial solutions (i.e., when \(\mathbf{v} \ne \mathbf{0}\)), the matrix \((\mathbf{A} - \lambda\mathbf{I})\) must be singular (non-invertible).
This leads to the Characteristic Equation:
\[ \det(\mathbf{A} - \lambda\mathbf{I}) = 0 \]
Procedure:
- Form the matrix \((\mathbf{A} - \lambda\mathbf{I})\): Subtract \(\lambda\) from all the leading diagonal elements of \(\mathbf{A}\).
- Calculate the Determinant: Calculate \(\det(\mathbf{A} - \lambda\mathbf{I})\). For a 3x3 matrix, this will result in a cubic polynomial in \(\lambda\).
- Solve the Equation: Solve the polynomial equation for \(\lambda\). These solutions are your Eigenvalues. (For a cubic, use the Factor Theorem to spot one root, then polynomial division to factorise the remaining quadratic.)
Common Mistake Alert: Students often forget to include the identity matrix \(\mathbf{I}\) when writing the characteristic equation. You cannot subtract a scalar (\(\lambda\)) from a matrix (\(\mathbf{A}\)) directly!
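You can check hand-computed eigenvalues numerically. The NumPy sketch below (with a hypothetical example matrix) finds the eigenvalues and confirms that each one makes \(\mathbf{A} - \lambda\mathbf{I}\) singular:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])

# np.linalg.eigvals solves det(A - lam*I) = 0 numerically.
eigenvalues = np.linalg.eigvals(A)

# Each eigenvalue makes (A - lam*I) singular: its determinant is ~0.
for lam in eigenvalues:
    assert abs(np.linalg.det(A - lam * np.eye(3))) < 1e-8

# Useful sanity check: the eigenvalues sum to the trace of A.
assert np.isclose(np.sum(eigenvalues.real), np.trace(A))
```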
2.3 Step-by-Step: Finding Eigenvectors (\(\mathbf{v}\))
Once you have the eigenvalues (\(\lambda\)), find each corresponding eigenvector by substituting that \(\lambda\) back into the rearranged equation:
\[ (\mathbf{A} - \lambda\mathbf{I})\mathbf{v} = \mathbf{0} \]
Procedure:
- Select an Eigenvalue: Choose one of the calculated eigenvalues, \(\lambda_1\).
- Substitute: Substitute \(\lambda_1\) into \((\mathbf{A} - \lambda_1\mathbf{I})\).
- Set up the System: Let \(\mathbf{v} = \begin{pmatrix} x \\ y \\ z \end{pmatrix}\) and set up the system of linear equations resulting from \((\mathbf{A} - \lambda_1\mathbf{I})\mathbf{v} = \mathbf{0}\).
- Solve (Parametrically): Since the determinant is zero, the equations will be linearly dependent. You should only need two of the equations to solve for the ratios of \(x, y, z\).
Tip: Let one variable (often \(z\) or \(x\)) equal a parameter, say \(t\) or \(k\), and solve for the others in terms of that parameter.
- Define the Eigenvector: Write the eigenvector \(\mathbf{v}\) in its simplest form (usually with integer components, by setting \(t=1\) or another suitable integer).
Did you know? If \(\mathbf{v}\) is an eigenvector, then any scalar multiple of \(\mathbf{v}\) (like \(3\mathbf{v}\) or \(-0.5\mathbf{v}\)) is also an eigenvector corresponding to the same eigenvalue \(\lambda\). We usually look for the simplest, non-zero integer representation.
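This scalar-multiple fact takes one line to verify (a NumPy sketch; the 2x2 matrix and eigenpair below were found by hand as a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                    # an eigenvalue of A
v = np.array([1.0, 1.0])     # a corresponding eigenvector

# Any nonzero scalar multiple of v is still an eigenvector for lam.
for c in (3.0, -0.5, 10.0):
    assert np.allclose(A @ (c * v), lam * (c * v))
```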
Key Takeaway: Eigen Stuff
Eigenvalues (\(\lambda\)) are scalars found by solving the characteristic polynomial. Eigenvectors (\(\mathbf{v}\)) are the vectors found by substituting those scalars back into \((\mathbf{A} - \lambda\mathbf{I})\mathbf{v} = \mathbf{0}\).
Section 3: Diagonalisation of Matrices
Diagonalisation is the process of transforming a potentially complex matrix \(\mathbf{A}\) into a much simpler matrix \(\mathbf{D}\), which is a Diagonal Matrix (a matrix where all non-diagonal entries are zero).
Why do we care? Because diagonal matrices are incredibly easy to work with, especially when calculating powers.
3.1 The Diagonalisation Process
We use the eigenvectors and eigenvalues to construct two new matrices:
- The Modal Matrix (\(\mathbf{P}\)): This matrix contains the eigenvectors of \(\mathbf{A}\) as its columns.
- The Diagonal Matrix (\(\mathbf{D}\)): This matrix contains the corresponding eigenvalues of \(\mathbf{A}\) along its main diagonal, and zeros everywhere else.
The relationship that links these matrices is:
\[ \mathbf{D} = \mathbf{P}^{-1}\mathbf{A}\mathbf{P} \]
If you are asked to diagonalise \(\mathbf{A}\), you need to find \(\mathbf{P}\), \(\mathbf{D}\), and sometimes \(\mathbf{P}^{-1}\).
Crucial Step: Ordering Matters!
The order of the eigenvectors in \(\mathbf{P}\) must match the order of the eigenvalues in \(\mathbf{D}\).
If: \[ \mathbf{P} = \begin{pmatrix} | & | & | \\ \mathbf{v}_1 & \mathbf{v}_2 & \mathbf{v}_3 \\ | & | & | \end{pmatrix} \] Then: \[ \mathbf{D} = \begin{pmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{pmatrix} \] where \(\mathbf{v}_i\) is the eigenvector corresponding to eigenvalue \(\lambda_i\).
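The ordering rule can be checked numerically. In this NumPy sketch (using a hypothetical 2x2 matrix whose eigenpairs are \(\lambda_1 = 3\) with \(\mathbf{v}_1 = (1, 1)\) and \(\lambda_2 = 1\) with \(\mathbf{v}_2 = (1, -1)\)), the columns of \(\mathbf{P}\) and the diagonal of \(\mathbf{D}\) are listed in matching order:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvectors as COLUMNS of P, eigenvalues on the diagonal of D,
# in the SAME order: (1,1) pairs with 3, then (1,-1) pairs with 1.
P = np.column_stack([[1.0, 1.0], [1.0, -1.0]])
D = np.diag([3.0, 1.0])

# D = P^{-1} A P
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```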
3.2 Using Diagonalisation to Calculate Matrix Powers (\(\mathbf{A}^n\))
This is the main application of diagonalisation. Calculating \(\mathbf{A}^{10}\) is painful. Calculating \(\mathbf{D}^{10}\) is trivial!
Start with the diagonalisation formula: \(\mathbf{D} = \mathbf{P}^{-1}\mathbf{A}\mathbf{P}\)
Rearrange to make \(\mathbf{A}\) the subject (multiply by \(\mathbf{P}\) on the left and \(\mathbf{P}^{-1}\) on the right): \[ \mathbf{A} = \mathbf{P}\mathbf{D}\mathbf{P}^{-1} \]
Now, let's find \(\mathbf{A}^2\):
\(\mathbf{A}^2 = (\mathbf{P}\mathbf{D}\mathbf{P}^{-1})(\mathbf{P}\mathbf{D}\mathbf{P}^{-1})\)
Since \(\mathbf{P}^{-1}\mathbf{P} = \mathbf{I}\) (the identity matrix), the middle terms cancel out:
\(\mathbf{A}^2 = \mathbf{P}\mathbf{D}(\mathbf{I})\mathbf{D}\mathbf{P}^{-1} = \mathbf{P}\mathbf{D}^2\mathbf{P}^{-1}\)
Generalising this for any power \(n\): \[ \mathbf{A}^n = \mathbf{P}\mathbf{D}^n\mathbf{P}^{-1} \]
Calculating \(\mathbf{D}^n\):
This is the easy part! If \(\mathbf{D}\) is a diagonal matrix, you simply raise each diagonal element to the power \(n\):
If \[ \mathbf{D} = \begin{pmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{pmatrix} \] Then \[ \mathbf{D}^n = \begin{pmatrix} a^n & 0 & 0 \\ 0 & b^n & 0 \\ 0 & 0 & c^n \end{pmatrix} \]
Therefore, to find \(\mathbf{A}^n\), you perform the three required matrix multiplications: \(\mathbf{P}\) multiplied by \(\mathbf{D}^n\), multiplied by \(\mathbf{P}^{-1}\).
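Putting it together, here is a NumPy sketch of \(\mathbf{A}^n = \mathbf{P}\mathbf{D}^n\mathbf{P}^{-1}\) for a hypothetical 2x2 example (eigenvalues 3 and 1), cross-checked against repeated multiplication:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.column_stack([[1.0, 1.0], [1.0, -1.0]])   # modal matrix
D = np.diag([3.0, 1.0])                          # matching eigenvalues
n = 10

# D^n is trivial: raise each diagonal entry to the power n.
D_n = np.diag(np.diag(D) ** n)

# A^n = P D^n P^{-1}
A_n = P @ D_n @ np.linalg.inv(P)

# Cross-check against naive repeated multiplication.
assert np.allclose(A_n, np.linalg.matrix_power(A, n))
```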
Quick Review: Diagonalisation Steps
- Find all eigenvalues \(\lambda_i\).
- Find the corresponding eigenvectors \(\mathbf{v}_i\).
- Form the Modal Matrix \(\mathbf{P}\) (columns are \(\mathbf{v}_i\)).
- Form the Diagonal Matrix \(\mathbf{D}\) (diagonal entries are \(\lambda_i\), in matching order).
- Find the Inverse of the Modal Matrix, \(\mathbf{P}^{-1}\).
- Calculate \(\mathbf{A}^n\) using \(\mathbf{P}\mathbf{D}^n\mathbf{P}^{-1}\).
Summary and Final Encouragement
You have now tackled the most advanced concepts in this chapter: understanding 3D geometry using 3x3 matrices, discovering the special stability of Eigenvectors, and using these properties to simplify calculations via Diagonalisation.
Remember, the skill in this chapter isn't just calculation; it's understanding the meaning of \(\mathbf{A}\mathbf{v} = \lambda\mathbf{v}\) in terms of geometric transformation. Keep practising finding those characteristic equations—they are the gateway to success! You've got this!