## Linear Algebra, Gram Schmidt Process

### Gram Schmidt Process

Any set of n linearly independent vectors can be converted into a set of n mutually orthogonal vectors spanning the same subspace.
This assumes an appropriate dot product, such as the standard dot product
over the rationals, reals, or complex numbers (with conjugation in the complex case).
By induction, assume the first n independent vectors have been transformed into
orthogonal vectors spanning the same subspace.
Call these orthogonal vectors x1, x2, …, xn.
Let y be the next independent vector.
For each i from 1 to n,
subtract xi × (xi.y) over (xi.xi) from y.
Call this new vector z, or if you prefer, xn+1.
Verify that z.xi is 0 for each i from 1 to n:
since the x vectors are mutually orthogonal, expanding z.xi leaves only
xi.y minus (xi.y over xi.xi) times xi.xi, which is 0.
Since z and y differ by a linear combination of x vectors,
the same space is spanned.
Thus z becomes the next vector in the orthogonal set.
This continues until the entire basis has been transformed.
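The inductive step above, applied to each vector in turn, can be sketched in Python. The function name and the list-of-lists representation are illustrative choices, not taken from the text:

```python
def gram_schmidt(vectors):
    """Convert a list of linearly independent vectors (lists of numbers)
    into a list of mutually orthogonal vectors spanning the same space."""
    def dot(a, b):
        return sum(p * q for p, q in zip(a, b))

    basis = []           # the orthogonal vectors built so far: x1, x2, ...
    for y in vectors:    # y is the next independent vector
        z = list(y)
        for x in basis:
            c = dot(x, y) / dot(x, x)   # projection coefficient (xi.y)/(xi.xi)
            z = [zi - c * xi for zi, xi in zip(z, x)]
        basis.append(z)  # z is orthogonal to every vector already in the basis
    return basis
```

On the two-dimensional example below, `gram_schmidt([[4, 0], [1, 1]])` returns `[[4, 0], [0.0, 1.0]]`.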

Here is a simple example in two dimensions.
Let v be the vector [4,0], pointing along the x axis,
and let w = [1,1], pointing up and to the right.
Now v and w form the bottom and left side of a parallelogram.
The Gram Schmidt process pushes this parallelogram back into a rectangle.
Subtract ¼ of v from w, since (v.w) over (v.v) = 4/16 = ¼; hence w becomes [0,1].
Now w points straight up, and the vectors are orthogonal.
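The arithmetic in this example can be checked directly; the variable names are just for illustration:

```python
v = [4, 0]
w = [1, 1]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

c = dot(v, w) / dot(v, v)                      # 4 / 16 = 0.25
w_new = [wi - c * vi for wi, vi in zip(w, v)]
print(c, w_new)                                # 0.25 [0.0, 1.0]
assert dot(v, w_new) == 0                      # v and the new w are orthogonal
```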

Gram Schmidt can be applied to a countable basis as well.
We'll do this in the next section.
For instance, real polynomials form an infinite dimensional vector space over the reals,
with basis 1, x, x², x³, …,
and this basis can be transformed into an orthogonal basis of polynomials
using the Gram Schmidt process.
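As a preview, here is one way that orthogonalization might look in code. The inner product of two polynomials is taken to be the integral of their product over [-1, 1]; that interval is an assumption here (a standard choice, which produces multiples of the Legendre polynomials), not something fixed by the text:

```python
from fractions import Fraction

def poly_inner(p, q):
    """Inner product: integral of p(x)*q(x) over [-1, 1] (an assumed choice),
    where a polynomial is a coefficient list [c0, c1, ...] for c0 + c1*x + ..."""
    prod = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            prod[i + j] += Fraction(a) * Fraction(b)
    # integral of x^k over [-1, 1] is 0 for odd k and 2/(k+1) for even k
    return sum((c * Fraction(2, k + 1)
                for k, c in enumerate(prod) if k % 2 == 0), Fraction(0))

def gs_polys(n):
    """Apply Gram Schmidt to the monomials 1, x, x^2, ..., x^(n-1)."""
    basis = []
    for k in range(n):
        y = [Fraction(0)] * k + [Fraction(1)]   # the monomial x^k
        z = list(y)
        for x in basis:
            c = poly_inner(x, y) / poly_inner(x, x)
            z = [zi - c * (x[i] if i < len(x) else Fraction(0))
                 for i, zi in enumerate(z)]
        basis.append(z)
    return basis
```

For example, `gs_polys(3)` yields 1, x, and x² − 1/3, the third being a multiple of the second Legendre polynomial.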