Lecture from 23.10.2024 | Video: Videos ETHZ

Gauss-Jordan Elimination

Gauss-Jordan elimination is an extension of Gaussian elimination that reduces a matrix to reduced row echelon form (RREF). This form simplifies solving linear systems and determining properties like rank and linear independence. First let us define REF and RREF.

REF and RREF

Let $A$ be an $m \times n$ matrix. $A$ is in row echelon form (REF) if the following holds: There exist column indices $j_1 < j_2 < \dots < j_r$ such that:

  1. For $i \in \{1, \dots, r\}$, we have $a_{i, j_i} = 1$ (1's in gray).
  2. For all $i \in \{1, \dots, m\}$, $j \in \{1, \dots, n\}$, we have $a_{ij} = 0$ whenever:
    • $i > r$ (completely white rows) or
    • $j < j_i$ (partially white rows) or
    • $j = j_k$ for some $k \neq i$ (0's in gray).

If $r = m$, $A$ is in reduced row echelon form (RREF) (no completely white rows).

You can denote these matrices by:

  • REF($r$)
  • RREF($r$).

Note that naturally you can’t convert a tall and skinny matrix into RREF, since you have more rows than columns (RREF requires a leading 1 in every row, but each column can contain at most one).

Another thing you can observe immediately is: a matrix in REF($r$) has rank $r$, since the pivot columns $j_1, \dots, j_r$ are independent.

Direct Solution of Linear Systems in REF

Direct solution: If $A$ is in REF($r$) (the last $m - r$ rows are zero), we can solve $Ax = b$ directly:

  • If $b_i \neq 0$ for some $i > r$: no solution! (since any linear combination of the column vectors of $A$ will return a $0$ in those rows)
  • Otherwise: set $x_{j_i} = b_i$ for $i \in \{1, \dots, r\}$ and all other entries of $x$ to $0$.

Our trivial solution, called the canonical solution, is when we simply put $b_i$ in the right spot $x_{j_i}$. Since the pivot columns are the unit vectors $e_1, \dots, e_r$, this gives $Ax = \sum_{i=1}^{r} b_i e_i = b$. Note that there might be other solutions too: you could technically put any number into $x$ at all the spots $j \notin \{j_1, \dots, j_r\}$ (the free variables), adjusting the pivot entries accordingly.
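Reading off the canonical solution can be sketched in code. This is a minimal sketch assuming $A$ is already in REF with unit pivot columns; the function name and the explicit `pivots` argument are our own, not from the lecture:

```python
import numpy as np

def canonical_solution(A, b, pivots):
    """A in REF with unit pivot columns at indices `pivots` (j_1 < ... < j_r).
    Returns the canonical solution of Ax = b, or None if no solution exists."""
    m, n = A.shape
    r = len(pivots)
    # Rows below the r-th are zero in A, so b must vanish there as well.
    if any(b[i] != 0 for i in range(r, m)):
        return None  # inconsistent system
    x = np.zeros(n)
    for i, j in enumerate(pivots):
        x[j] = b[i]  # put b_i at position j_i; all other entries stay 0
    return x

# A is in REF with pivots in columns 0 and 2 (column 1 is free).
A = np.array([[1., 3., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
b = np.array([2., 5., 0.])
x = canonical_solution(A, b, pivots=[0, 2])  # canonical solution [2, 0, 5]
```

Changing the free entry $x_1$ away from $0$ would require adjusting the pivot entries to keep $Ax = b$.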

Solution Set

When we have a matrix in REF, it’s easy to see the solution (or lack thereof).

  • Unique Solution: If $A$ is the identity matrix, the unique solution is $x = b$.
  • Infinite Solutions: If $A$ has a row of zeros, and the corresponding entry in $b$ is also zero, there are infinitely many solutions (free variables).
  • No Solution: If $A$ has a row of zeros, but the corresponding entry in $b$ is non-zero, the system is inconsistent and has no solution.
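The three cases can be checked with sympy's `linsolve`, which accepts an augmented matrix $[A \mid b]$ (a sketch; the systems below are our own small examples):

```python
from sympy import Matrix, linsolve, symbols

x, y = symbols('x y')

# Unique solution: the coefficient part is the identity matrix.
unique = linsolve(Matrix([[1, 0, 2], [0, 1, 3]]), x, y)  # {(2, 3)}
# Infinitely many solutions: zero row with zero right-hand side (y is free).
free = linsolve(Matrix([[1, 2, 3], [0, 0, 0]]), x, y)
# No solution: zero row with non-zero right-hand side.
empty = linsolve(Matrix([[1, 2, 1], [0, 0, 1]]), x, y)  # EmptySet
```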

Gauss-Jordan Elimination Algorithm

The Gauss-Jordan algorithm extends Gaussian elimination by performing the following steps:

  1. Forward Elimination (Gaussian Elimination): Transform the matrix into row echelon form (REF) using the same steps as Gaussian elimination. This involves creating pivots (leading 1s) and using them to eliminate entries below.
  2. Backward Elimination: Starting from the last pivot, eliminate entries above the pivot using row operations. This step transforms the matrix from REF to RREF.
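The two phases can be sketched as a small routine (a simplified version without partial pivoting; exact arithmetic is assumed, so floating-point round-off is ignored):

```python
import numpy as np

def gauss_jordan(A):
    """Return (RREF of A, list of pivot column indices)."""
    A = A.astype(float).copy()
    m, n = A.shape
    pivots = []
    row = 0
    # 1. Forward elimination: create leading 1s, clear entries below them.
    for col in range(n):
        if row == m:
            break
        # find a row at or below `row` with a nonzero entry in this column
        nz = next((i for i in range(row, m) if A[i, col] != 0), None)
        if nz is None:
            continue  # no pivot in this column
        A[[row, nz]] = A[[nz, row]]   # swap the pivot row into place
        A[row] /= A[row, col]         # scale the pivot to 1
        for i in range(row + 1, m):   # eliminate entries below the pivot
            A[i] -= A[i, col] * A[row]
        pivots.append(col)
        row += 1
    # 2. Backward elimination: clear entries above each pivot (REF -> RREF).
    for i in reversed(range(len(pivots))):
        col = pivots[i]
        for k in range(i):
            A[k] -= A[k, col] * A[i]
    return A, pivots
```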

Example

Let’s use Gauss-Jordan elimination to transform the following matrix into RREF:

$$A = \begin{pmatrix} 1 & 3 & 1 \\ 2 & 7 & 4 \\ 3 & 10 & 5 \end{pmatrix}$$

  1. Forward Elimination:
    • Subtract 2 times row 1 from row 2: $\begin{pmatrix} 1 & 3 & 1 \\ 0 & 1 & 2 \\ 3 & 10 & 5 \end{pmatrix}$
    • Subtract 3 times row 1 from row 3: $\begin{pmatrix} 1 & 3 & 1 \\ 0 & 1 & 2 \\ 0 & 1 & 2 \end{pmatrix}$
    • Subtract row 2 from row 3: $\begin{pmatrix} 1 & 3 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{pmatrix}$
    This is now in REF.
  2. Backward Elimination:
    • Subtract 3 times row 2 from row 1: $\begin{pmatrix} 1 & 0 & -5 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{pmatrix}$

This is now in RREF.

Properties and Applications

  • Rank: The rank of a matrix is equal to the number of non-zero rows (or pivots) in its RREF.
  • Linear Independence: The columns of a matrix are linearly independent if and only if the RREF has a pivot in every column.
  • Inverse Calculation: Gauss-Jordan elimination can be used to calculate the inverse of a matrix (see Calculating the Inverse).
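The rank and independence properties can be checked with sympy's built-in `Matrix.rref()` (the example matrix here is our own, chosen to have rank 2):

```python
from sympy import Matrix

A = Matrix([[1, 3, 1],
            [2, 7, 4],
            [3, 10, 5]])
R, pivot_cols = A.rref()      # RREF of A and the indices of the pivot columns
rank = len(pivot_cols)        # rank = number of pivots
independent = rank == A.cols  # columns independent iff a pivot in every column
```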

Properties of REF

Invertibility and REF

Theorem 3.21: Let $A$ be an $m \times n$ matrix. There exists an invertible $m \times m$ matrix $T$ such that $TA$ is in REF.

Proof: Every elementary row operation is equivalent to multiplying by an invertible elementary matrix. Since REF can be reached through a sequence of elementary row operations, $TA$ can be obtained by multiplying $A$ by the product $T$ of those invertible matrices.

Important Note: The theorem doesn’t guarantee $TA$ to be in RREF, only in REF.

Connection to Solving Linear Systems: The matrix $T$ represents the sequence of row operations used to transform $A$ into REF. If we apply the same sequence of row operations to the augmented matrix $[A \mid b]$, we get $[TA \mid Tb]$. Since $TA$ is in REF, we can now directly solve the system $TAx = Tb$ using the method described above. This gives us the same solutions as the original system $Ax = b$.

It’s quite similar to the usefulness of LU decomposition. See: [[10 Calculating the Inverse, LU and LUP Decomposition#solving-ax—b-with-lu|Solving with LU]]
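One concrete way to obtain an invertible matrix $T$ with $TA$ in REF (a sketch, not the lecture's construction): row-reduce the augmented block $[A \mid I_m]$ and keep the right half, which records the accumulated row operations.

```python
import numpy as np

def reduction_matrix(A):
    """Row-reduce [A | I]; the right block is an invertible T with T @ A in REF."""
    m, n = A.shape
    M = np.hstack([A.astype(float), np.eye(m)])
    row = 0
    for col in range(n):
        if row == m:
            break
        # find a row at or below `row` with a nonzero entry in this column
        nz = next((i for i in range(row, m) if M[i, col] != 0), None)
        if nz is None:
            continue
        M[[row, nz]] = M[[nz, row]]   # swap the pivot row into place
        M[row] /= M[row, col]         # scale the pivot to 1
        for i in range(m):            # clear the rest of the pivot column
            if i != row:
                M[i] -= M[i, col] * M[row]
        row += 1
    return M[:, n:]  # the accumulated row operations, i.e. T
```

Since every step multiplies both halves by the same elementary matrix, the final block is $[TA \mid T]$.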

Independence and REF

Theorem 3.22: Let $A$ be an $m \times n$ matrix, $T$ an invertible $m \times m$ matrix, and $R = TA$ in REF($r$). Then $A$ has independent columns $j_1, \dots, j_r$.

Proof:

  • Column $j$ of $A$ is dependent $\Leftrightarrow$ there is an $x \neq 0$ in $\mathbb{R}^n$ with $Ax = 0$, $x_j \neq 0$, and $x_{j'} = 0$ for $j' > j$. This means that column $j$ can be written as a linear combination of previous columns.
  • The same condition is true for $R$: $Rx = TAx = T \cdot 0 = 0$ and, conversely, $Ax = T^{-1}Rx = 0$.
  • Since $Ax = 0$ and $Rx = 0$ have the same solutions, $A$ and $R$ have the same (in)dependent columns. $R$ has independent columns $j_1, \dots, j_r$ because they are leading columns in REF.

Important Note: This theorem applies to REF, not RREF.

An interesting consequence of this theorem is that if $A$ is invertible, then by the Inverse Theorem (Inverse Theorem (Theorem 3.11)), we know all columns are independent, $r = n$, and the REF of $A$ is the identity matrix $I_n$.

CR decomposition

Theorem 3.24: Let $A$ be an $m \times n$ matrix. Then $A = C \cdot R$, where $C$ consists of the independent columns of $A$ and $R$ describes how to combine them to get all columns of $A$.

  • Let $T$ be invertible such that $\tilde{R} = TA$ is in REF($r$). Then:
    • $R$ = the first $r$ rows of $\tilde{R}$ (the nonzero rows of $\tilde{R}$).
    • $C$ = columns $j_1, \dots, j_r$ of $A$ (the independent columns of $A$)

Proof: $A = T^{-1}(TA) = T^{-1}\tilde{R}$.

  1. $T$ is the product of elementary row operation matrices that transform $A$ into $\tilde{R}$.
  2. $C$ has columns $j_1, \dots, j_r$ of $A$ (the independent ones by Theorem 3.22).
  3. $\tilde{R}$ has columns $j_1, \dots, j_r$: the unit vectors $e_1, \dots, e_r$.

Visualization:

  • $\begin{pmatrix} I_r \\ 0 \end{pmatrix}$: an $m \times r$ matrix consisting of the identity matrix ($I_r$) in the top $r$ rows and zeros below; these are the pivot columns $j_1, \dots, j_r$ of $\tilde{R}$.
  • $R$: an $r \times n$ matrix with the remaining (nonzero) rows of $\tilde{R}$.

Key Points:

  • The CR decomposition provides a way to express an arbitrary matrix in terms of its independent columns and the corresponding unit vectors.
  • It allows us to understand the relationship between the columns of and its REF.
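A sketch of computing such a decomposition with sympy's `rref` (the example matrix is our own): the pivot columns of $A$ give $C$, and the nonzero rows of the row echelon form give $R$.

```python
from sympy import Matrix

A = Matrix([[1, 3, 1],
            [2, 7, 4],
            [3, 10, 5]])          # example matrix with rank 2
R_tilde, pivot_cols = A.rref()    # row echelon form of A and pivot column indices
r = len(pivot_cols)
R = R_tilde[:r, :]                # the first r (nonzero) rows
C = A[:, list(pivot_cols)]        # the r independent columns of A
assert C * R == A                 # the CR decomposition: A = C * R
```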

Continue here: 12 Vector Spaces, Subspaces