# 2.1 Vectors and Linear Equations

## Abstract

In the last chapter we learned some basics of linear algebra: vectors, linear combinations, and related ideas. Simple as they are, they are crucial concepts of linear algebra. In this post we are going to talk about how to solve a system of equations, which is the central problem of linear algebra.

**Keywords**: *Linear Equations, Row Picture, Column Picture, Coefficient Matrix, Elimination, Matrix Form of Equation, Identity Matrix*.

## The Map of Concepts

## Row Picture and Column Picture

What is the main use of linear algebra? The answer is almost always **to solve a system of equations**, and **linear algebra can only solve linear equations**. Equations are linear when every unknown is multiplied only by a number: no products of unknowns such as $x\times y$. Here is an example of a system of linear equations, "two equations, two unknowns":

$$

\begin{aligned}

x-2y&=1\newline

3x+2y&=11

\end{aligned}\tag{2.1.1}

$$

### Row Picture

Look at one row of the system, for instance the first row $x-2y=1$. Every pair $(x,y)$ that solves this equation lies on a line in the Euclidean plane (with three unknowns, each equation would instead give a plane in space). This line is:

Every point on this line satisfies the equation $x-2y=1$. Then we draw the line for the second equation, $3x+2y=11$.

The intersection point $(3,1)$, where the two lines meet, lies on both lines, so $(3,1)$ solves both equations. In other words, this point is the solution of the whole system.

Row Picture: The row picture shows two lines meeting at a single point (the solution)
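The row picture can be checked numerically. Here is a minimal sketch (using NumPy-style plain Python, with the candidate point read off the picture) that confirms $(3,1)$ lies on both lines:

```python
# The two lines of the row picture:
#   x - 2y = 1
#   3x + 2y = 11
# Their intersection point is the solution of the system.
x, y = 3, 1  # candidate solution read off the picture

on_line1 = (x - 2 * y == 1)
on_line2 = (3 * x + 2 * y == 11)
print(on_line1 and on_line2)  # True: (3, 1) lies on both lines
```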

### Column Picture

Viewing the same equations from a different angle brings us something new. We can rewrite the equations as a linear combination:

$$

x\begin{bmatrix}1\newline 3\end{bmatrix}+

y\begin{bmatrix}-2\newline 2\end{bmatrix}=

\begin{bmatrix}1\newline 11\end{bmatrix}\tag{2.1.2}

$$

That is a linear combination of two vectors: $\vec{v}=\begin{bmatrix}1\newline 3\end{bmatrix}$ and $\vec{w}=\begin{bmatrix}-2\newline 2\end{bmatrix}$. Note that $\begin{bmatrix}1\newline 11\end{bmatrix}$ is the vector $\vec{b}$, so the problem becomes the one from (1.3 Introduction to Matrices): find $\vec{x}$ when $\vec{b}$ is known.

Although I don’t usually like drawing pictures for vectors, this time a drawing makes the column picture much clearer:

1. Draw $\vec{v}=(1,3)$, $\vec{w}=(-2,2)$ and $\vec{b}=(1,11)$.

2. Multiply $\vec{v}=(1,3)$ by the scalar 3.

3. Multiply $\vec{w}=(-2,2)$ by the scalar 1, and add the result to $3\vec{v}=(3,9)$, so we get:

There it is: we obtain a new vector by linearly combining two 2-dimensional vectors with different directions. So we have:

$$

3\begin{bmatrix}1\newline 3\end{bmatrix}+\begin{bmatrix}-2\newline 2\end{bmatrix}=\begin{bmatrix}1\newline 11\end{bmatrix}\tag{2.1.3}

$$

and $x=3$, $y=1$, which is of course the same solution as in the row picture.
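The column picture translates directly into a NumPy check: scale the two columns and add them, and the vector $\vec{b}$ comes out.

```python
import numpy as np

v = np.array([1, 3])    # column multiplying x
w = np.array([-2, 2])   # column multiplying y
b = np.array([1, 11])

# Column picture: find x, y with x*v + y*w = b.
combo = 3 * v + 1 * w   # try x = 3, y = 1
print(combo)            # [ 1 11], which matches b
```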

## Coefficient Matrix, Matrix and Linear Equations

We can rewrite (2.1.2) into a “matrix multiply by vector” form, like this:

$$

A\vec{x}=\vec{b}\tag{2.1.4}

$$

where $A=\begin{bmatrix}1&-2\newline 3&2\end{bmatrix}$ is the **coefficient matrix**, $\vec{x}=\begin{bmatrix}x\newline y\end{bmatrix}$ and $\vec{b}=\begin{bmatrix}1\newline 11\end{bmatrix}$. The problem of solving the equations has thus been converted into recovering $\vec{x}$ from $\vec{b}$, which we discussed in the last post.

Then we get a **matrix equation**:

$$

\begin{bmatrix}1&-2\newline 3&2\end{bmatrix}\begin{bmatrix}x\newline y\end{bmatrix}=\begin{bmatrix}1\newline 11\end{bmatrix}\tag{2.1.5}

$$
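For a square, invertible coefficient matrix, NumPy can solve this matrix equation directly; a quick sketch:

```python
import numpy as np

A = np.array([[1, -2],
              [3,  2]])    # coefficient matrix
b = np.array([1, 11])

x = np.linalg.solve(A, b)  # solves A x = b when A is square and invertible
print(x)                   # [3. 1.]
```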

If the coefficient matrix is not square ($n\times n$), that is, when there are more unknowns than equations or more equations than unknowns, we will need some further concepts to solve the system, such as matrix multiplication, inverses, and invertible matrices.

### Elimination

Elimination is a fundamental way to solve equations, and it is also a crucial idea for solving equations with matrices. Let us review how we solved equations by elimination in high school:

1. The equations are:

$$

\begin{eqnarray}

x-2y=1 \tag{2.1.1a} \newline

3x+2y=11 \tag{2.1.1b}

\end{eqnarray}

$$

2. Multiplication: multiply $(2.1.1a)$ by $3$ to get:

$$

\begin{eqnarray}

3x-6y=3 \tag{2.1.1c} \newline

3x+2y=11 \tag{2.1.1b}

\end{eqnarray}

$$

3. Subtraction: compute $(2.1.1b)-(2.1.1c)$:

$$

\begin{eqnarray}

3x-6y=3 \tag{2.1.1c} \newline

0x+8y=8 \tag{2.1.1d}

\end{eqnarray}

$$

4. Then we get $y=1$; substituting $y$ back into (2.1.1c) or (2.1.1a) gives $x=3$.

The problem is solved, and we are left with an interesting system:

$$

\begin{aligned}

3x-6y&=3 \newline

0x+8y&=8

\end{aligned}\tag{2.1.6}

$$

Its coefficient matrix becomes:

$$

U=\begin{bmatrix}3&-6\newline 0&8\end{bmatrix}

$$

It is an **upper triangular** matrix.
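The same row operations can be carried out on the augmented matrix $[A\mid\vec{b}]$; a small sketch that reproduces the steps above:

```python
import numpy as np

# Augmented matrix [A | b] for x - 2y = 1 and 3x + 2y = 11
M = np.array([[1.0, -2.0,  1.0],
              [3.0,  2.0, 11.0]])

M[0] = 3 * M[0]      # (2.1.1a) x 3   ->  3x - 6y = 3
M[1] = M[1] - M[0]   # (2.1.1b) - (2.1.1c)  ->  0x + 8y = 8

# Back-substitution
y = M[1, 2] / M[1, 1]                   # y = 1
x = (M[0, 2] - M[0, 1] * y) / M[0, 0]   # x = 3
print(M[:, :2])  # the upper triangular U = [[3, -6], [0, 8]]
print(x, y)      # 3.0 1.0
```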

Here we list four steps to understand elimination by matrices:

- Elimination goes from $A$ to a triangular $U$ by a sequence of matrix steps $E_{ij}$.
- The inverse matrices $E^{-1}_{ij}$, applied in reverse order, bring $U$ back to the original $A$.
- In matrix language that reverse order is $A=LU$ = (lower triangular)(upper triangular).
- Elimination succeeds if $A$ is invertible (it may need row exchanges).
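These four steps can be checked on our $2\times 2$ example. A hedged sketch, using a single elimination matrix $E_{21}$ (subtract $3\times$ row 1 from row 2) rather than the row scaling used above:

```python
import numpy as np

A = np.array([[1, -2],
              [3,  2]])

# One elimination step E21: subtract 3 x row 1 from row 2
E21 = np.array([[ 1, 0],
                [-3, 1]])
U = E21 @ A             # upper triangular: [[1, -2], [0, 8]]
L = np.linalg.inv(E21)  # the inverse step adds 3 x row 1 back

print(U)
print(L @ U)            # recovers A, illustrating A = LU
```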

The coefficient matrix of a system will not always be $n\times n$; it can be $m\times n$, which makes the problem more complicated. Moreover, even when $A$ is square, $A\vec{x}=\vec{0}$ may have more than one solution. All these situations will lead us to more useful concepts: the solutions form a *vector space*, the *rank* of $A$, the *dimension* of vector spaces, and so on.

## The Matrix Form of the Equations

The coefficient matrix is the matrix filled with the coefficients of the unknowns. Suppose the system is three equations in three unknowns:

$$

\begin{aligned}

1x+2y+3z&=6\newline

2x+5y+2z&=4\newline

6x-3y+1z&=2

\end{aligned}

$$

The coefficient matrix is $A=\begin{bmatrix}1&2&3\newline 2&5&2\newline 6&-3&1 \end{bmatrix}$ and the unknowns vector is $\vec{x}=\begin{bmatrix}x\newline y \newline z\end{bmatrix}$. Here we again have two viewpoints:

1. Row Picture:

$$

A\vec{x}=

\begin{bmatrix}

\text{(row1)}\cdot\vec{x}\newline

\text{(row2)}\cdot\vec{x}\newline

\text{(row3)}\cdot\vec{x}\newline

\end{bmatrix}

$$

2. Column Picture:

$$

A\vec{x}=x\,(\text{column 1})+y\,(\text{column 2})+z\,(\text{column 3})

$$

Both viewpoints are valid, but from now on, every $A\vec{x}$ in this blog is understood as a linear combination of the columns of the matrix $A$.
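The two viewpoints compute the same product; a small sketch with the $3\times 3$ matrix above and a sample vector (the vector chosen here is an assumption for illustration):

```python
import numpy as np

A = np.array([[1,  2, 3],
              [2,  5, 2],
              [6, -3, 1]])
v = np.array([0, 0, 2])   # sample unknowns vector (x, y, z)

# Row picture: one dot product per row
by_rows = np.array([A[i] @ v for i in range(3)])

# Column picture: a linear combination of the columns
by_cols = v[0] * A[:, 0] + v[1] * A[:, 1] + v[2] * A[:, 2]

print(by_rows)   # [6 4 2]
print(by_cols)   # same result
```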

### Identity Matrix

We have a special number in ordinary multiplication: “1”. Every number multiplied by 1 is still itself. Among matrices we also have a “1”-like matrix:

$$

I=\begin{bmatrix}

1&\dots&0\newline

\vdots&\ddots&\vdots\newline

0&\dots&1

\end{bmatrix}

$$

The entries on the main diagonal are all ones, and all other entries are zeroes. Its notation is $I$, and we call it the **identity matrix**. Multiplying the identity matrix by any vector (or matrix) gives back that vector (or matrix):

$$

I\vec{x}=\vec{x}

$$
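A quick NumPy check of this property (`np.eye` builds an identity matrix of the given size):

```python
import numpy as np

I = np.eye(3)                    # 3x3 identity matrix
v = np.array([1.0, 11.0, 5.0])   # an arbitrary sample vector

print(I @ v)                     # the vector comes back unchanged
```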

## Matrix Notation

A matrix is written as a block of numbers, and every entry of the block has its own index: for instance, $a_{ij}$ is the entry in row $i$, column $j$. So a $2\times 2$ matrix can be written as:

$$

A=\begin{bmatrix}

a_{11}&a_{12}\newline

a_{21}&a_{22}

\end{bmatrix}=

\begin{bmatrix}

A(1,1)&A(1,2)\newline

A(2,1)&A(2,2)

\end{bmatrix}

$$

If a matrix is $m\times n$, then the indices of its entry $a_{ij}$ are bounded by $1\leq i\leq m$ and $1\leq j\leq n$.
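One caveat when moving between math notation and code: $a_{ij}$ is 1-based, while NumPy (like most programming languages) indexes from 0. A small sketch:

```python
import numpy as np

A = np.array([[1, -2],
              [3,  2]])

# Math notation a_ij is 1-based; NumPy indexing is 0-based,
# so a_12 (row 1, column 2) is A[0, 1].
print(A[0, 1])   # -2
```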

## Conclusion

- The basic operations on vectors are scalar multiplication $c\vec{v}$ and vector addition $\vec{v}+\vec{w}$.
- Combining both gives a linear combination $c\vec{v}+d\vec{w}$.
- Matrix-vector multiplication $A\vec{x}$ can be computed by dot products, one row at a time.
- $A\vec{x}$ should be understood as a combination of the columns of $A$.
- Column picture: $A\vec{x}=\vec{b}$ asks for a combination of the columns of $A$ that produces $\vec{b}$.
- Row picture: each equation gives a line, a plane, or a hyperplane. They intersect at the solution or solutions, if any.

## Reference

1. Strang G. Introduction to Linear Algebra. Wellesley, MA: Wellesley-Cambridge Press, 1993.
