In this section we'll revisit systems of linear equations. You've seen them before: two linear equations with two unknowns, say **x** and **y**, where you're trying to find the point of intersection (if there is one). Or you've seen problems with three equations and three unknowns (**x**, **y** and **z**, or **x₁**, **x₂** and **x₃**).

Each equation is a piece of information, and in general we need **n** independent pieces of information (equations) to solve for **n** variables.

It's actually not uncommon to need to solve very large systems of equations, even with hundreds of variables. Using matrices, we'll work toward much simpler – and much more scalable – methods of doing that.

Let's solve this system "the old-fashioned way":

    2x - y + z = 3
    x + 2y + 3z = 14
    -x + y - z = -2

One way to do this (and it's not the *only* way) is to solve for one variable in one of the equations; we'll try **z** in the first equation here,

    z = -2x + y + 3

... then replace **z** with the result (-2x + y + 3) in the second and third equations. We've thus eliminated **z** from those two equations and reduced the 3-equation / 3-unknown problem to a 2-equation / 2-unknown problem:

    x + 2y + 3(-2x + y + 3) = 14
    -x + y - (-2x + y + 3) = -2

These reduce to:

    -5x + 5y = 5
    x - 3 = -2

Conveniently, the value of **x** falls right out of the second equation (**x** = 1), and from that we can easily solve for **y** using -5x + 5y = 5, then **z** using any of the original three equations:

    -5(1) + 5y = 5  →  y = 2
    z = -2(1) + 2 + 3 = 3

Now let's see how matrices can make solving systems a little easier. We'll take that same system of equations,

    2x - y + z = 3
    x + 2y + 3z = 14
    -x + y - z = -2

and write it in the form of an **augmented matrix**, which looks like this:

    [  2  -1   1 |  3 ]
    [  1   2   3 | 14 ]
    [ -1   1  -1 | -2 ]

The augmented matrix is just a block of numbers: the coefficients of **x**, **y** and **z** from the left side of the = sign, plus a column of numbers from the right side of the = sign that we'll call the **result vector**. Now, how do we solve the system using this weird matrix?

Our goal will be to make the 3×3 matrix on the left side of the line **upper triangular**. Recall that an upper-triangular matrix has only zeros below the diagonal. You'll see how it works if you keep going. We'll accomplish that goal by applying **elementary row operations**.

Elementary row operations include adding a multiple of one row of a matrix (or an augmented matrix) to another row, and multiplying an entire row by a nonzero constant.
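To make those two operations concrete, here's a short Python sketch (not from the original article). The matrix entries are illustrative, chosen to be consistent with this section's worked solution **x** = 1, **y** = 2, **z** = 3.

```python
# A sketch of the two elementary row operations, representing a matrix
# as a list of rows (each row a list of numbers).

def scale_row(M, i, k):
    """Multiply row i of matrix M by the nonzero constant k (in place)."""
    M[i] = [k * x for x in M[i]]

def add_multiple(M, i, j, k):
    """Replace row i with (row i) + k * (row j)."""
    M[i] = [a + k * b for a, b in zip(M[i], M[j])]

# An illustrative augmented matrix [A | b] whose system has solution
# x = 1, y = 2, z = 3.
A = [[ 2, -1,  1,  3],
     [ 1,  2,  3, 14],
     [-1,  1, -1, -2]]

add_multiple(A, 2, 1, 1)   # "R3 + R2": zero out the -1 in the lower-left
print(A[2])                # → [0, 3, 2, 12]
```

Both helpers modify the matrix in place, which mirrors how we'll "replace a row with the result" in the steps below.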

Here is our augmented matrix again (below), and right next to it, a note about what the first elementary row operation will be. "**R3+R2**" means "*add Row 2 to Row 3 and replace Row 3 with the result*."

    [  2  -1   1 |  3 ]
    [  1   2   3 | 14 ]
    [ -1   1  -1 | -2 ]     R3 + R2

There's nothing special about the "**R3+R2**" notation. It's how *I* remember what *I'm* doing, but you might come up with your own notation. The aim of this operation is to get rid of that -1 in the lower-left corner of the matrix. Here's the result:

    [  2  -1   1 |  3 ]
    [  1   2   3 | 14 ]
    [  0   3   2 | 12 ]

We're on our way to an upper-triangular 3×3. That targeted matrix element is now zero, so we'll do the next operation, **2R2-R1**, which means "*multiply row 2 by 2, subtract row 1 from it, and place the result back in row 2*." The aim this time is to turn the leading 1 in the second row into a zero. Here's the result of that operation:

    [  2  -1   1 |  3 ]
    [  0   5   5 | 25 ]
    [  0   3   2 | 12 ]

The next operation, **5R3-3R2**, is designed to zero out the second element (3) of the third row. It means "*multiply row 3 by 5, subtract from it 3 times row 2, and place the result back in row 3*." Here's the result:

    [  2  -1   1 |   3 ]
    [  0   5   5 |  25 ]
    [  0   0  -5 | -15 ]

Now this system is essentially solved because it's in upper-triangular form. The solutions are simple; we just start with **z**, plug it into the next equation to get **y**, and so on:

    -5z = -15  →  z = 3
    5y + 5(3) = 25  →  y = 2
    2x - 2 + 3 = 3  →  x = 1
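The whole procedure – forward elimination to upper-triangular form, then back-substitution from the bottom row up – can be sketched in a few lines of Python. This is a minimal illustration (no row swapping or zero-pivot handling), and the test matrix is an example system with solution x = 1, y = 2, z = 3, not necessarily the article's original numbers.

```python
# Minimal Gaussian elimination with back-substitution.
# Assumes every pivot is nonzero (no partial pivoting).

def gaussian_solve(aug):
    """Solve a system given as an n x (n+1) augmented matrix."""
    n = len(aug)
    A = [[float(x) for x in row] for row in aug]   # work on a copy
    # Forward elimination: make the left block upper triangular.
    for col in range(n):
        for row in range(col + 1, n):
            factor = A[row][col] / A[col][col]
            A[row] = [a - factor * b for a, b in zip(A[row], A[col])]
    # Back-substitution: solve from the last equation up.
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        s = sum(A[row][c] * x[c] for c in range(row + 1, n))
        x[row] = (A[row][n] - s) / A[row][row]
    return x

aug = [[ 2, -1,  1,  3],
       [ 1,  2,  3, 14],
       [-1,  1, -1, -2]]
print(gaussian_solve(aug))   # → [1.0, 2.0, 3.0]
```

Note that the code never forms the explicit "2R2-R1"-style combinations from the text; dividing by the pivot first is the standard computational form of the same idea.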

The process of making the matrix upper-triangular through a series of elementary row operations is called **Gaussian elimination** (named for mathematician Carl Friedrich Gauss).

In that last example, we used elementary row operations (Gaussian elimination) to reduce the matrix to upper-triangular form, and it was basically solved, except for a couple of algebra moves. We can go further, though, to make an identity matrix out of the left side, in which case the remaining result vector on the right of the augmented matrix will be the solution (**x, y, z**) with no extra algebra needed.

That technique is called **Gauss-Jordan elimination**, and here's how that might work. We'll pick up from our upper-triangular matrix:

    [  2  -1   1 |   3 ]
    [  0   5   5 |  25 ]
    [  0   0  -5 | -15 ]

The operation "**R2+R3**" is designed to zero out the last 5 in the second row of the 3 × 3 matrix. The result is below, and to that we'll add an operation, **5R1+R3**, to zero out the 1 in the 3rd position of the first row.

    [  2  -1   1 |   3 ]     5R1 + R3
    [  0   5   0 |  10 ]
    [  0   0  -5 | -15 ]

Next we'll zero out the remaining non-diagonal element:

    [ 10  -5   0 |   0 ]     R1 + R2
    [  0   5   0 |  10 ]
    [  0   0  -5 | -15 ]

The result is a diagonal matrix, and we only have to divide each row by its diagonal element to finish:

    [ 10   0   0 |  10 ]     R1 ÷ 10
    [  0   5   0 |  10 ]     R2 ÷ 5
    [  0   0  -5 | -15 ]     R3 ÷ (-5)

Here's the result:

    [  1   0   0 |  1 ]
    [  0   1   0 |  2 ]
    [  0   0   1 |  3 ]

Notice that the vector on the right contains our expected result, **x** = 1, **y** = 2 and **z** = 3.
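Gauss-Jordan elimination is also easy to sketch in code: keep eliminating above the diagonal as well as below it, scaling each pivot row to 1 as you go, and the result column becomes the solution. As before, the example matrix is an illustration with solution x = 1, y = 2, z = 3, and this sketch assumes nonzero pivots.

```python
# Minimal Gauss-Jordan elimination: reduce the left block of an
# augmented matrix to the identity; the right column is then (x, y, z).

def gauss_jordan(aug):
    n = len(aug)
    A = [[float(x) for x in row] for row in aug]
    for col in range(n):
        # Scale the pivot row so the pivot becomes 1.
        A[col] = [x / A[col][col] for x in A[col]]
        # Zero out this column in every OTHER row (above and below).
        for row in range(n):
            if row != col:
                f = A[row][col]
                A[row] = [a - f * b for a, b in zip(A[row], A[col])]
    return [row[n] for row in A]

aug = [[ 2, -1,  1,  3],
       [ 1,  2,  3, 14],
       [-1,  1, -1, -2]]
print(gauss_jordan(aug))   # → [1.0, 2.0, 3.0]
```

The only difference from plain Gaussian elimination is the `if row != col` loop, which clears entries above the diagonal too, so no back-substitution is needed.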

While Gauss-Jordan elimination isn't strictly needed to solve a system of equations, it's certainly a good goal because the solution just falls right in your lap. In other sections we'll find even better ways of solving systems that are much larger than 3-dimensional.
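For those much larger systems, hand elimination gives way to library routines. As one possible illustration, NumPy's solver (which uses an LU factorization, a close cousin of Gaussian elimination) handles the same example system in one call; the matrix entries here are again my illustrative stand-in for the section's system.

```python
# Solving a linear system with a library routine instead of by hand.
import numpy as np

A = np.array([[ 2, -1,  1],
              [ 1,  2,  3],
              [-1,  1, -1]], dtype=float)
b = np.array([3, 14, -2], dtype=float)

x = np.linalg.solve(A, b)   # LU-based direct solve
print(x)                    # solution: x = 1, y = 2, z = 3
```

The same call scales to hundreds or thousands of unknowns, which is exactly the regime where row-by-row hand work stops being practical.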

Hopefully you can see that solving some linear systems, especially ones with more equations and more unknowns, can be made easier with Gaussian or Gauss-Jordan elimination using augmented matrices.

In sections ahead, you'll learn two other important methods of solving linear systems with matrices: the inverse matrix method and Cramer's rule.


**xaktly.com** by Dr. Jeff Cruzan is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. © 2016, Jeff Cruzan. All text and images on this website not specifically attributed to another source were created by me and I reserve all rights as to their use. Any opinions expressed on this website are entirely mine, and do not necessarily reflect the views of any of my employers. Please feel free to send any questions or comments to jeff.cruzan@verizon.net.