## Linear Algebra, Matrix Operations

### Matrix Operations

Two matrices of the same size can be added together by adding their corresponding entries.

A matrix can be scaled by scaling all its entries.
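These two operations are easy to sketch in code. Here is a minimal Python version, working on plain lists of lists; the helper names `mat_add` and `mat_scale` are my own, not anything standard.

```python
def mat_add(a, b):
    """Add two matrices of the same size, entry by entry."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def mat_scale(c, a):
    """Scale a matrix by multiplying every entry by c."""
    return [[c * x for x in row] for row in a]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(mat_add(a, b))    # [[6, 8], [10, 12]]
print(mat_scale(2, a))  # [[2, 4], [6, 8]]
```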

Lo and behold, the set of 4×7 matrices over R forms a vector space, provided R is a division ring. But this isn't very interesting by itself; it's just the standard vector space of dimension 28. The power of matrices doesn't appear until we multiply them together.

Two matrices can be multiplied if the number of columns in the first equals the number of rows in the second. If the product matrix is p, the entry p[i,j] is computed from the ith row of the first matrix and the jth column of the second. Note that these two vectors have the same number of elements. Take their dot product, i.e. the sum of the products of the corresponding entries. Here is the procedure.

```
// requires a.numcols == b.numrows
for (i = 0; i < a.numrows; ++i) {
    for (j = 0; j < b.numcols; ++j) {
        p[i][j] = 0;
        for (k = 0; k < a.numcols; ++k)
            p[i][j] += a[i][k] * b[k][j];
    }
}
```
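The same triple loop can be written as a complete, runnable Python sketch; `mat_mul` is a hypothetical name for illustration.

```python
def mat_mul(a, b):
    """Multiply an m-by-n matrix a by an n-by-q matrix b."""
    assert len(a[0]) == len(b)  # columns of a must match rows of b
    p = [[0] * len(b[0]) for _ in range(len(a))]
    for i in range(len(a)):
        for j in range(len(b[0])):
            for k in range(len(a[0])):
                p[i][j] += a[i][k] * b[k][j]
    return p

# A 2-by-3 matrix times a 3-by-2 matrix gives a 2-by-2 matrix.
a = [[1, 2, 3], [4, 5, 6]]
b = [[7, 8], [9, 10], [11, 12]]
print(mat_mul(a, b))  # [[58, 64], [139, 154]]
```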

### Matrices Form A Ring

Let S be the set of n×n square matrices over a ring R. Verify the following.

• S is an abelian group under matrix addition.
• The zero matrix (all zeros) is the additive identity in S.
• Matrix multiplication distributes over addition, left and right. Remember, R might not be commutative.
• The identity matrix is the multiplicative identity in S.
• Matrix multiplication is associative.

Only the last is tricky. Let p = (a\*b)\*c, or a\*(b\*c), and consider the entry p[i,j]. You'll need the ith row of a\*b and the jth column of c, or the ith row of a and the jth column of b\*c. Either way it comes out the same: the sum of a[i,k]×b[k,l]×c[l,j], as k and l range over all the entries of b. I'll leave the algebra to you.
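Associativity is easy to spot-check numerically. A small Python sketch, reusing the hypothetical `mat_mul` from above:

```python
def mat_mul(a, b):
    """Multiply an m-by-n matrix a by an n-by-q matrix b."""
    p = [[0] * len(b[0]) for _ in range(len(a))]
    for i in range(len(a)):
        for j in range(len(b[0])):
            for k in range(len(a[0])):
                p[i][j] += a[i][k] * b[k][j]
    return p

# (a*b)*c and a*(b*c) agree, as the argument above predicts.
a = [[1, 2], [3, 4]]
b = [[0, 1], [1, 0]]
c = [[2, 0], [0, 3]]
print(mat_mul(mat_mul(a, b), c))  # [[4, 3], [8, 9]]
print(mat_mul(a, mat_mul(b, c)))  # [[4, 3], [8, 9]]
```

This is only a spot check on one triple of matrices, of course; the algebra above is what proves it in general.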

The set S is now a ring, with an embedded subring isomorphic to R, realized by the scaled copies of the identity matrix. Also, S is both a left and a right R-module.
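The embedding r → r·I can be sketched directly; `scalar_matrix` and `mat_mul` are hypothetical helpers for illustration.

```python
def mat_mul(a, b):
    """Multiply an m-by-n matrix a by an n-by-q matrix b."""
    p = [[0] * len(b[0]) for _ in range(len(a))]
    for i in range(len(a)):
        for j in range(len(b[0])):
            for k in range(len(a[0])):
                p[i][j] += a[i][k] * b[k][j]
    return p

def scalar_matrix(r, n):
    """The n-by-n matrix with r down the diagonal: the image of r in S."""
    return [[r if i == j else 0 for j in range(n)] for i in range(n)]

# Multiplying by scalar_matrix(r, n) scales every entry by r,
# which is exactly the module action of R on S.
a = [[1, 2], [3, 4]]
print(mat_mul(scalar_matrix(5, 2), a))  # [[5, 10], [15, 20]]
```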

Note that S is rarely commutative. Try multiplying [0,1|0,0] by [0,0|1,0], and then reverse it; the answer depends on the order of the operands.
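The suggested experiment, carried out in Python with the hypothetical `mat_mul` from above:

```python
def mat_mul(a, b):
    """Multiply an m-by-n matrix a by an n-by-q matrix b."""
    p = [[0] * len(b[0]) for _ in range(len(a))]
    for i in range(len(a)):
        for j in range(len(b[0])):
            for k in range(len(a[0])):
                p[i][j] += a[i][k] * b[k][j]
    return p

a = [[0, 1], [0, 0]]
b = [[0, 0], [1, 0]]
print(mat_mul(a, b))  # [[1, 0], [0, 0]]
print(mat_mul(b, a))  # [[0, 0], [0, 1]]
```

The two products differ, so even these tiny 2×2 matrices fail to commute.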