# 1.3 Introduction to Matrices

## Abstract

Vectors are the elements of linear algebra, and vectors together with their linear combinations can be considered the roots of the subject. However, solving every problem with bare vectors would quickly become cumbersome, so we need something more deft. Let me present the **matrix**, the most widely used object in linear algebra. This post will also describe two further concepts about vectors: independence and dependence.

**keywords**: *Linear Equations, Vectors, Matrices, Independence, Dependence, Cyclic Differences, Singular, Invertible*

## The Map of Concepts

## Build a Matrix

Some of us may have seen matrices before: a matrix is a rectangular block filled with numbers. Sometimes they are small, like the 2 by 2 or 3 by 3 matrices we often meet in textbooks, but sometimes they are huge, perhaps millions by millions or even bigger in applications. We can never build a new definition without a foundation, so we define a matrix from vectors, as below. Suppose we have three vectors:

$$

\begin{aligned}

\vec{u}&=\begin{bmatrix}1\newline -1 \newline 0\end{bmatrix}\newline

\vec{v}&=\begin{bmatrix}0\newline 1\newline -1\end{bmatrix}\newline

\vec{w}&=\begin{bmatrix}0\newline 0 \newline 1\end{bmatrix}

\end{aligned}\tag{1.3.1}

$$

I have to say that these three vectors are chosen carefully, and they will be used soon. According to the definition of linear combination, we can build a new three-dimensional vector:

$$

\begin{aligned}

&c\vec{u}+d\vec{v}+e\vec{w}\newline

&=c\begin{bmatrix}1\newline -1 \newline 0\end{bmatrix}+d\begin{bmatrix}0\newline 1\newline -1\end{bmatrix}+e\begin{bmatrix}0\newline 0 \newline 1\end{bmatrix}\newline

&=\begin{bmatrix}c\newline d-c \newline e-d\end{bmatrix}

\end{aligned}\tag{1.3.2}

$$

Now we rewrite that combination using a matrix. The vectors $\vec{u},\vec{v},\vec{w}$ go into the columns of a matrix $A$, and the numbers $c,d,e$ go into a vector (call it $\vec{x}$). Then the linear combination becomes a matrix “times” a vector:

$$

\begin{bmatrix}

1&0&0\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

c\newline

d\newline

e

\end{bmatrix}=

\begin{bmatrix}c\newline d-c \newline e-d\end{bmatrix}\tag{1.3.3}

$$

So the matrices we will use are built from vectors, and we also get a new operation, matrix times vector:

$$

A\vec{x}=

\begin{bmatrix}

&&\newline

\vec{u}&\vec{v}&\vec{w}\newline

&&

\end{bmatrix}

\begin{bmatrix}

c\newline

d\newline

e

\end{bmatrix}=

c\vec{u}+d\vec{v}+e\vec{w}\tag{1.3.4}

$$

This brings us not just a new definition but a crucial change in viewpoint: at the beginning we multiplied the vectors by the numbers $c,d,e$ to form $c\vec{u}+d\vec{v}+e\vec{w}$, while now the matrix $A$ multiplies the vector $\vec{x}$. That leads to a very important conclusion: the result $\vec{b}$ of $A\vec{x}$ is a linear combination of the columns of $A$.
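The column viewpoint is easy to check numerically. Below is a minimal NumPy sketch (the variable names and the sample weights $c,d,e$ are my own, not from the text) showing that $A\vec{x}$ equals the combination $c\vec{u}+d\vec{v}+e\vec{w}$:

```python
import numpy as np

# The vectors u, v, w from (1.3.1) become the columns of A.
u = np.array([1, -1, 0])
v = np.array([0, 1, -1])
w = np.array([0, 0, 1])
A = np.column_stack([u, v, w])

# Arbitrary sample weights for the combination.
c, d, e = 2, 3, 4
x = np.array([c, d, e])

# A @ x is exactly the linear combination of the columns.
combination = c * u + d * v + e * w
print(np.array_equal(A @ x, combination))  # True
```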

We can use a more general vector $\vec{x}$, in which $\vec{x}=\begin{bmatrix}x_1\newline x_2\newline x_3\end{bmatrix}$, to see more details:

$$

\begin{bmatrix}

1&0&0\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

x_1\newline

x_2\newline

x_3

\end{bmatrix}=

\begin{bmatrix}x_1\newline x_2-x_1 \newline x_3-x_2\end{bmatrix}=\begin{bmatrix}b_1\newline b_2 \newline b_3\end{bmatrix}=\vec{b}\tag{1.3.5}

$$

We said that this matrix and its column vectors were chosen specially. This particular matrix is called a “difference matrix”, because the result $\vec{b}$ consists of differences of the entries of $\vec{x}$. If we assume there is an entry $x_0$ before $x_1$ with $x_0=0$, we get:

$$

\begin{bmatrix}x_1-x_0\newline x_2-x_1\newline x_3-x_2\end{bmatrix}=\begin{bmatrix}b_1\newline b_2 \newline b_3\end{bmatrix}\tag{1.3.6}

$$

Let us see a numerical example. According to the figure, our vector is $\vec{x}=\begin{bmatrix}2\newline 7\newline 3\end{bmatrix}$, and we calculate $A\vec{x}$ as:

$$

\begin{bmatrix}

1&0&0\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

2\newline

7\newline

3

\end{bmatrix}=\begin{bmatrix}2\newline 7-2 \newline 3-7\end{bmatrix}=\begin{bmatrix}2\newline 5 \newline -4\end{bmatrix}=\vec{b}\tag{1.3.7}

$$

The result in the figure is $(5,-4)$ while our result is $(2,5,-4)$. The discrepancy comes from the boundary: the first entry of the difference depends on the artificial $x_0=0$, so it carries no real information and can simply be dropped.
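As a quick check of this numerical example (a NumPy sketch; the array names are mine):

```python
import numpy as np

# The difference matrix A from (1.3.3).
A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])

x = np.array([2, 7, 3])
b = A @ x
print(b)  # [ 2  5 -4]
```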

In another viewpoint, the row picture, we get a new interpretation: use the rows instead of the columns.

Our matrix comes from vectors, and each column of the matrix is a vector, so we naturally computed $A\vec{x}$ column by column. However, we can change to a “row” viewpoint, taking dot products with the rows:

$$

A\vec{x}=\begin{bmatrix}

1&0&0\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

x_1\newline

x_2\newline

x_3

\end{bmatrix}

=\begin{bmatrix}

(1,0,0)\cdot(x_1,x_2,x_3)\newline

(-1,1,0)\cdot(x_1,x_2,x_3)\newline

(0,-1,1)\cdot(x_1,x_2,x_3)

\end{bmatrix}\tag{1.3.8}

$$

The result is the same as in the column way. So far we have done $A\vec{x}$ in two ways, and both are useful. When the matrix and vector are purely numerical (no unknowns) and we need a numerical result, I think the row way makes for faster calculation. On the contrary, when $A$, $\vec{x}$ and $A\vec{x}$ stand for some objects or processes and contain parameters or unknowns, the column way is better.
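The two pictures can be compared directly in code. This sketch (my own illustration, not from the text) computes $A\vec{x}$ once by row dot products as in (1.3.8) and once as a combination of columns, and confirms they agree:

```python
import numpy as np

A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])
x = np.array([2, 7, 3])

# Row picture: each entry of b is a dot product of a row of A with x.
b_rows = np.array([row @ x for row in A])

# Column picture: b is a combination of the columns of A weighted by x.
b_cols = sum(x[j] * A[:, j] for j in range(3))

print(np.array_equal(b_rows, b_cols))  # True
```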

Linear combinations are the key to linear algebra

## Linear Equations

One more change in viewpoint is necessary. Until now, $A\vec{x}=\vec{b}$ has been a process with known $A$ and $\vec{x}$ but unknown $\vec{b}$, and our purpose was to compute $\vec{b}$. The new idea is to make $\vec{b}$ known and $\vec{x}$ unknown, so our target is no longer $\vec{b}$ but $\vec{x}$.

The old question: compute the linear combination $x_1\vec{u}+x_2\vec{v}+x_3\vec{w}$ to find $\vec{b}$.

The new one: which combination of $\vec{u},\vec{v},\vec{w}$ produces a particular vector $\vec{b}$?

In the last section, we considered $\vec{b}$ as the output of a system $A$ with input $\vec{x}$, and we can compute the output for different inputs by the row way or the column way. But now, how can we compute the input from a known output?

Back in high school or middle school, we learned how to solve systems of equations. We can regard:

$$

\begin{bmatrix}

1&0&0\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

x_1\newline

x_2\newline

x_3

\end{bmatrix}=\begin{bmatrix}

b_1\newline

b_2\newline

b_3

\end{bmatrix}\tag{1.3.9}

$$

as:

$$

\begin{aligned}

1\times x_1+0\times x_2+0\times x_3&=b_1\newline

-1\times x_1+1\times x_2+0\times x_3&=b_2\newline

0\times x_1-1\times x_2+1\times x_3&=b_3

\end{aligned}\tag{1.3.10}

$$

that is:

$$

\begin{aligned}

1x_1& & &=b_1\newline

-x_1&+x_2& &=b_2\newline

&-x_2&+x_3 &=b_3

\end{aligned}\tag{1.3.11}

$$

We can solve this in any way we like (bottom to top or top to bottom both work, but top to bottom is easier here because the matrix $A$ we selected is lower triangular), and we get:

$$

\begin{aligned}

x_1&=b_1& & \newline

x_2&=b_1&+b_2& \newline

x_3&=b_1&+b_2&+b_3

\end{aligned}\tag{1.3.12}

$$
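The pattern in (1.3.12) is a running sum: each $x_i$ is the cumulative sum $b_1+\dots+b_i$. A small NumPy sketch (my own check, with an arbitrary sample $\vec{b}$) confirms that the cumulative sum undoes the difference matrix:

```python
import numpy as np

b = np.array([2, 4, 6])  # an arbitrary right-hand side

# (1.3.12): x1 = b1, x2 = b1 + b2, x3 = b1 + b2 + b3, i.e. a cumulative sum.
x = np.cumsum(b)
print(x)  # [ 2  6 12]

# Check: feeding x back through the difference matrix recovers b.
A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])
print(np.array_equal(A @ x, b))  # True
```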

It is beautiful and graceful. Let’s look at two specific choices of $\vec{b}$: $(0,0,0)$ and $(1,3,5)$.

- when $\vec{b}=(0,0,0)$ we can get:

$$

\vec{x}=\begin{bmatrix}0\newline 0+0\newline 0+0+0\end{bmatrix}=\begin{bmatrix}0\newline 0\newline 0\end{bmatrix}\tag{1.3.13}

$$

- when $\vec{b}=(1,3,5)$ we can get:

$$

\vec{x}=\begin{bmatrix}1\newline 1+3\newline 1+3+5\end{bmatrix}=\begin{bmatrix}1\newline 4\newline 9\end{bmatrix}\tag{1.3.14}

$$

The first solution is more important than it looks. The conclusion is: for this matrix $A$, the output $\vec{b}=\vec{0}$ forces the input $\vec{x}=\vec{0}$. But this is not true for every matrix. The following example will show how a zero vector output can come from a nonzero input, even when the matrix itself is not all zeros.

If the matrix $A$ is invertible, we can recover $\vec{x}$ from $\vec{b}$

## The Inverse Matrix

The right-hand side of (1.3.12) can be rewritten as a matrix times a vector:

$$

\vec{x}

=\begin{bmatrix}

x_1\newline

x_2\newline

x_3

\end{bmatrix}=S \vec{b}=

\begin{bmatrix}

1&0&0\newline

1&1&0\newline

1&1&1

\end{bmatrix}

\begin{bmatrix}

b_1\newline

b_2\newline

b_3

\end{bmatrix}\tag{1.3.15}

$$

Up to now, we know $A$ is a difference matrix: we got $\vec{b}=(1,3,5)$ through $A$ from the input $\vec{x}=(1,4,9)$. Equation (1.3.15) gives us a matrix through which the “new input” $\vec{b}=(1,3,5)$ is converted back into $\vec{x}=(1,4,9)$, and we call this matrix $S$ the **sum matrix**.

The sum matrix $S$ is the inverse of the difference matrix $A$, and (1.3.15) tells us two facts:

1. for every $\vec{b}$ there is exactly one solution to $A\vec{x}=\vec{b}$;

2. the matrix $S$ produces that solution: $\vec{x}=S\vec{b}$.

We will talk more about solving the equation $A\vec{x}=\vec{b}$ in the next chapters. In linear algebra, the notation for the “inverse matrix” is $A^{-1}$:

$$

A\vec{x}=\vec{b} \text{ is solved by }\vec{x}=A^{-1}\vec{b}=S\vec{b}\tag{1.3.16}

$$
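The inverse relation between the difference matrix and the sum matrix is easy to verify numerically; here is a NumPy sketch (my own check, not part of the text):

```python
import numpy as np

# Difference matrix A from (1.3.3).
A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])

# Inverting A recovers the sum matrix S of (1.3.15):
# a lower triangle of ones.
S = np.linalg.inv(A)
print(np.allclose(S, np.tril(np.ones((3, 3)))))  # True

# x = S b solves A x = b, e.g. b = (1, 3, 5) gives x = (1, 4, 9).
b = np.array([1, 3, 5])
print(np.allclose(S @ b, [1, 4, 9]))  # True
```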

## Cyclic Differences

Different matrices have different meanings. The matrix $A$ above is a difference matrix; if we alter it a little, we get a new one with a definitely different meaning.

**Cyclic**:

$$

C\vec{x}=

\begin{bmatrix}

1&0&-1\newline

-1&1&0\newline

0&-1&1

\end{bmatrix}

\begin{bmatrix}

x_1\newline

x_2\newline

x_3

\end{bmatrix}=

\begin{bmatrix}

x_1-x_3\newline

x_2-x_1\newline

x_3-x_2

\end{bmatrix}

=\vec{b}\tag{1.3.17}

$$

The new matrix $C$ is not triangular, so it is not easy to find $\vec{x}$ from a given $\vec{b}$. Actually it is impossible to solve $C\vec{x}=\vec{b}$ in general, because there can be infinitely many solutions or none at all.

1. When $\vec{b}=(0,0,0)$, every constant vector (like $\vec{x}=(c,c,c)$) is a solution.

2. When $\vec{b}=(x,y,z)$ with $x+y+z\neq 0$, there is no solution: the entries of the left-hand side $\begin{bmatrix}x_1-x_3\newline x_2-x_1\newline x_3-x_2\end{bmatrix}$ always sum to $0$, while the entries on the right sum to $x+y+z\neq 0$, so no $\vec{x}$ can produce such a $\vec{b}$.
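Both observations can be checked in a few lines of NumPy (my own sketch): the determinant of $C$ is zero, so $C$ has no inverse, and every constant vector is sent to $\vec{0}$:

```python
import numpy as np

# Cyclic difference matrix C from (1.3.17).
C = np.array([[ 1,  0, -1],
              [-1,  1,  0],
              [ 0, -1,  1]])

# C is singular: its determinant vanishes, so np.linalg.inv(C) would fail.
print(abs(np.linalg.det(C)) < 1e-12)  # True

# Every constant vector (c, c, c) is mapped to the zero vector.
ones = np.array([1, 1, 1])
print(np.array_equal(C @ ones, [0, 0, 0]))  # True
```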

Solving linear equations is an important question in linear algebra, and many useful concepts grew out of it.

## Independence and Dependence

Independence and dependence are concepts about vectors, not matrices. Linear combinations of vectors can fill a line, a plane, or a space. If two vectors are independent, they do not lie on the same line; if three vectors are independent, they do not lie in the same plane:

Independence: $\vec{w}$ is not in the plane of $\vec{u}$ and $\vec{v}$

Dependence: $\vec{w^{\star}}$ is in the plane of $\vec{u}$ and $\vec{v}$

Let’s look at the matrix $C=\begin{bmatrix}1&0&-1\newline-1&1&0\newline0&-1&1\end{bmatrix}$; its columns are three vectors:

$$

\begin{aligned}

\vec{u}&=\begin{bmatrix}1\newline -1 \newline 0\end{bmatrix}\newline

\vec{v}&=\begin{bmatrix}0\newline 1\newline -1\end{bmatrix}\newline

\vec{w^\star}&=\begin{bmatrix}-1\newline 0 \newline 1\end{bmatrix}

\end{aligned}\tag{1.3.18}

$$

The important point is that the new $\vec{w^\star}$ is a linear combination of $\vec{u}$ and $\vec{v}$ :

$$

\vec{w^\star}=-\vec{u}-\vec{v}

$$

The linear combinations of these three vectors fill the same plane that is spanned by the linear combinations of $\vec{u}$ and $\vec{v}$; that is to say, $\vec{w^\star}$ was already in the plane.

But the matrix $A$ in (1.3.3) is different. Its columns are independent, and their linear combinations fill a three-dimensional space. **Independence** and **dependence** for this problem mean:

$\vec{u}$, $\vec{v}$ and $\vec{w}$ are independent: no combination except $0\vec{u}+0\vec{v}+0\vec{w}=\vec{0}$ gives $\vec{b}=\vec{0}$.

$\vec{u}$, $\vec{v}$ and $\vec{w^\star}$ are dependent: many combinations $c\vec{u}+d\vec{v}+e\vec{w^\star}=\vec{0}$ give $\vec{b}=\vec{0}$.

The same idea is useful for vectors in more than three dimensions. When these vectors go into the columns of a matrix:

$\vec{u}$, $\vec{v}$ and $\vec{w}$ are independent. $A\vec{x}=\vec{0}$ has one solution. $A$ is an invertible matrix.

$\vec{u}$, $\vec{v}$ and $\vec{w^\star}$ are dependent. $C\vec{x}=\vec{0}$ has many solutions. $C$ is a singular matrix.

The matrices here are all square ($n\times n$); the problem of $m\times n$ matrices will be discussed in Chapter 3.
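In NumPy, independence of the columns of a square matrix can be tested with the matrix rank (a sketch of my own, not from the text): full rank means independent columns and an invertible matrix, while a rank deficit means dependent columns and a singular matrix.

```python
import numpy as np

# A has independent columns u, v, w (invertible).
A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])

# C has dependent columns u, v, w* with w* = -u - v (singular).
C = np.array([[ 1,  0, -1],
              [-1,  1,  0],
              [ 0, -1,  1]])

print(np.linalg.matrix_rank(A))  # 3 -> independent columns, invertible
print(np.linalg.matrix_rank(C))  # 2 -> dependent columns, singular
```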

## Conclusion

- $A\vec{x}$ is a combination of the columns of $A$
- The solution to $A\vec{x}=\vec{b}$ is $\vec{x}=A^{-1}\vec{b}$ when $A$ is invertible
- The difference matrix $A$ is inverted by the sum matrix $S=A^{-1}$
- The cyclic matrix $C$ has no inverse
- Independent columns make a matrix invertible; dependent columns make it singular

## Reference

1. Strang G. Introduction to Linear Algebra[M]. Wellesley, MA: Wellesley-Cambridge Press, 1993.
