Integral Extensions, Finitely Generated = Integral

Finitely Generated = Integral

This is an important theorem in commutative algebra, and it's rather counterintuitive. It says that the sum or product of integral elements is integral. Now why should that be? It makes sense with algebraic elements. Add or multiply two algebraic elements together and you're still part of a finite extension, not too far off the floor. The result should still be accessible from some polynomial in the base ring. But if u and v are roots of monic polynomials, why should u+v be the root of a monic polynomial? Why not some other polynomial with a leading coefficient of 317? I can't provide an intuitive explanation; I can only offer the proof, and hope you can glean some intuition from that.
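Before the proof, a concrete instance may help. Here is a quick check, a sketch assuming the sympy library is available: u = sqrt(2) and v = sqrt(3) are roots of the monic polynomials x^2-2 and x^2-3, and their sum turns out to be a root of a monic polynomial as well.

```python
# Sketch with sympy (assumed dependency): the sum of two integral
# elements, sqrt(2) + sqrt(3), is again the root of a monic integer
# polynomial.
from sympy import sqrt, Symbol, minimal_polynomial

x = Symbol('x')
p = minimal_polynomial(sqrt(2) + sqrt(3), x)
print(p)  # x**4 - 10*x**2 + 1
```

The leading coefficient is 1, not 317; the theorem below explains why that always happens.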

Warning, this proof uses a matrix to represent an endomorphism from a finitely generated module into itself, and then takes the determinant of that matrix. If these concepts are foreign to you, you may want to review module endomorphisms, and determinants.

Let t be an element in the ring extension S/R. If t is integral then the extension R[t] is generated by the powers of t, from t^0 up to t^(n-1), where the monic polynomial of t has degree n. In other words, the ring R[t] is a finitely generated R module inside S.
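The reduction of higher powers is mechanical. A sketch, assuming sympy: take t = sqrt(2), with monic relation t^2 - 2 = 0; polynomial remainder rewrites any power of t as an R-linear combination of t^0 and t^1.

```python
# Sketch: if t satisfies the monic relation t**2 - 2 = 0, every power
# of t reduces to an R-linear combination of 1 and t, so R[t] is
# spanned by finitely many powers of t.
from sympy import Symbol, rem

t = Symbol('t')
monic = t**2 - 2          # monic polynomial satisfied by t (here t = sqrt(2))
r = rem(t**5, monic, t)   # reduce t**5 modulo the monic relation
print(r)                  # 4*t
```

Because the relation is monic, the division never introduces denominators; the coefficients stay in R.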

Note, if t were merely algebraic, satisfying something like 2t^2-3 = 0, we would have to replace t^2, wherever it occurs, with 3/2, and 3/2 is not part of our base ring. So we really need t to be integral over R.

Now for the converse. Let M be a finitely generated R module in S, where M contains R[t]. Note that M could be R[t], as above, or it could be something larger.

Furthermore, let multiplication by t map M into itself. This is true when M is R[t], or S, and perhaps several modules in between. Since R and t both act on M, M is an R[t] module, as well as an R module.

Since M contains R[t], which contains 1, the only element of R[t] that drives all of M to 0 is 0. Using terminology from the world of modules, the annihilator of M, as an R[t] module, is 0.

Give M a basis b1 b2 b3 … bn. This is the finite set of generators that spans all of M. Each member of M is a linear combination of these basis elements. However, the representation may not be unique; B may not be a basis in the truest sense of the word. I'm abusing the terminology just a bit. But B does span, and everything in M has at least one representation relative to the basis B.

Since M is an R module, multiplication by x in R multiplies the coefficients on the basis elements by x. However, multiplication by t replaces each bj with its image, which is a prescribed linear combination of the n basis elements. In other words, multiplication by t is described by an n×n matrix, where the jth row defines tbj. Since we've already used the letter M, let's call this matrix Y.

To recap, we can represent an element from M as a vector v, where v holds the coefficients on the basis B. Then v*Y, using matrix multiplication, produces the coefficients of t*v. There may be other linear combinations of the generators in B that produce tv, but vY is definitely one of them.
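Here is the setup in miniature, a sketch assuming sympy: take t = sqrt(2)+sqrt(3) acting on the module spanned by B = (1, sqrt(2), sqrt(3), sqrt(6)). Each row of Y records t times one basis element, and v*Y does reproduce t*v.

```python
# Sketch: Y is the matrix for multiplication by t = sqrt(2)+sqrt(3) on
# the module spanned by B = (1, sqrt(2), sqrt(3), sqrt(6)).  Row j of Y
# holds the coefficients of t*B[j] relative to B.
from sympy import sqrt, Matrix, expand

B = Matrix([1, sqrt(2), sqrt(3), sqrt(6)])
t = sqrt(2) + sqrt(3)
Y = Matrix([
    [0, 1, 1, 0],   # t*1       = sqrt(2) + sqrt(3)
    [2, 0, 0, 1],   # t*sqrt(2) = 2 + sqrt(6)
    [3, 0, 0, 1],   # t*sqrt(3) = 3 + sqrt(6)
    [0, 3, 2, 0],   # t*sqrt(6) = 3*sqrt(2) + 2*sqrt(3)
])
v = Matrix([[1, 2, 0, 0]])        # the element 1 + 2*sqrt(2), as a row vector
lhs = expand((v * Y * B)[0, 0])   # the coefficients v*Y, applied to the basis
rhs = expand(t * (v * B)[0, 0])   # t times the element itself
print(lhs == rhs)                 # True
```

The entries of Y all lie in the base ring Z, even though t itself does not.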

Now subtract t from the main diagonal of Y. Y isn't a matrix over R any more; it has become a matrix over R[t]. And vY doesn't represent vt any more, it represents vt-vt, or 0. It doesn't matter what v is, run it through Y and get 0. You don't get a zero vector; you get a vector of entries from R[t] which, applied as coefficients to the basis elements, produces 0 in M.

Let d be the determinant of Y, which is an element of R[t].

Build a matrix Z as follows. Start with the identity matrix, then replace the first column, so that the entry in row i, column 1 is bi. In other words, the first column contains our basis.

Consider the product Y*Z, using matrix multiplication. The first row of Y is dotted with the first column of Z. Setting aside the -t in the upper left of Y, the first row of Y contains coefficients from R, which are applied to the basis elements in the first column of Z. But this is, by definition, the image of b1 when multiplied by t. So without the -t, the upper left entry of the product is b1t.

Now bring in -t and subtract b1t, leaving 0. Thus the upper left entry in the product matrix is 0.

Multiply the second row of Y by the first column of Z, and obtain b2t-b2t, or 0. This continues all the way down the column. I don't know what the other columns look like; it doesn't matter. The first column is 0, hence the determinant of the product is 0. In other words, det(Y)det(Z) = 0. The determinant of Y is d, and the determinant of Z is b1. Thus db1 = 0.
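This vanishing first column can be watched in a concrete case. A sketch, assuming sympy: take t = sqrt(2)+sqrt(3) acting on the basis (1, sqrt(2), sqrt(3), sqrt(6)); subtract t from the diagonal of the multiplication matrix, multiply by Z, and the first column collapses to zero.

```python
# Sketch: build the multiplication matrix for t = sqrt(2)+sqrt(3) on the
# basis B = (1, sqrt(2), sqrt(3), sqrt(6)), subtract t from the diagonal,
# and multiply by Z (identity with first column replaced by the basis).
# Every entry of the first column of the product is b_i*t - t*b_i = 0.
from sympy import sqrt, Matrix, eye, expand

B = [1, sqrt(2), sqrt(3), sqrt(6)]
t = sqrt(2) + sqrt(3)
Y = Matrix([[0, 1, 1, 0],
            [2, 0, 0, 1],
            [3, 0, 0, 1],
            [0, 3, 2, 0]]) - t * eye(4)   # t subtracted from the diagonal
Z = eye(4)
for i in range(4):
    Z[i, 0] = B[i]                        # first column of Z holds the basis
first_col = (Y * Z).col(0).applyfunc(expand)
print(first_col.T)                        # Matrix([[0, 0, 0, 0]])
```

Here the entries of Y live in R[t] rather than R, exactly as in the proof.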

Leave the first column of Z alone, and rearrange the ones in the remaining columns. This time arrange the ones so that the jth row is missing a one. Last time the first row was missing a one. This time the determinant of Z is ±bj. Here's an example of Z with determinant b3.

b1 1 0 0 0
b2 0 1 0 0
b3 0 0 0 0
b4 0 0 1 0
b5 0 0 0 1

Evaluate the product YZ, and once again the first column is zero. This means dbj = 0 for every j.

Remember that d is an element of R[t], and it maps every generator bj to 0. By linearity it sends the entire module M to 0. But the annihilator of M is 0, therefore d = 0.

Remember, d is the determinant of Y, which is zero. Expand this determinant into a polynomial in t, with coefficients in R. The leading term, coming from the product down the main diagonal, is (-t)^n; multiply through by -1 if necessary and the polynomial becomes monic, and is still equal to zero. Therefore t is the root of a monic polynomial with coefficients in R, and t is integral.
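To close the loop on the concrete example, a sketch assuming sympy: the characteristic polynomial of the multiplication matrix for t = sqrt(2)+sqrt(3) is monic with coefficients in Z, and t is one of its roots.

```python
# Sketch: det(x*I - Y), for Y the multiplication matrix of
# t = sqrt(2)+sqrt(3) on the basis (1, sqrt(2), sqrt(3), sqrt(6)),
# is a monic polynomial over Z, and t satisfies it.
from sympy import sqrt, Matrix, Symbol, expand

x = Symbol('x')
Y = Matrix([[0, 1, 1, 0],
            [2, 0, 0, 1],
            [3, 0, 0, 1],
            [0, 3, 2, 0]])
p = Y.charpoly(x).as_expr()   # det(x*I - Y), monic by construction
print(p)                      # x**4 - 10*x**2 + 1
t = sqrt(2) + sqrt(3)
print(expand(p.subs(x, t)))   # 0
```

This is the same monic polynomial that witnesses the integrality of sqrt(2)+sqrt(3) directly.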

You might think R needs to be an integral domain, so that the product of the determinants is the determinant of the product, but this result holds for all commutative rings, thanks to a rather technical proof. Thus there are no restrictions on R, other than the usual restrictions for this topic, i.e. being commutative and having 1.

Happy Corollaries

  1. Let the ring extension S be finitely generated as an R module. Every t in S belongs to a finitely generated R module inside S, namely S itself, and multiplication by t maps S into S. Therefore every t is integral, and S is an integral extension of R.

  2. Let u and v be integral elements in a ring extension S/R, and consider R[u,v]. Cross the powers of u with the powers of v: the products u^i*v^j, with i and j below the degrees of the monic polynomials of u and v, span R[u,v] as an R module, so R[u,v] is finitely generated. Therefore R[u,v] is an integral extension. Use induction to generalize this to a finite set of integral elements.
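A sketch of the crossed powers, assuming sympy: with u = sqrt(2) and v = sqrt(3), the products u^i*v^j for i, j in {0, 1} are 1, sqrt(2), sqrt(3), sqrt(6), and any polynomial in u and v lands back in their span.

```python
# Sketch: u = sqrt(2), v = sqrt(3).  The four products u**i * v**j
# (i, j < 2) span R[u, v]; for instance (u+v)**3 reduces to a Z-linear
# combination of sqrt(2) and sqrt(3).
from sympy import sqrt, expand

u, v = sqrt(2), sqrt(3)
e = expand((u + v)**3)
print(e)  # 11*sqrt(2) + 9*sqrt(3)
```

The higher powers of u and v fold down via u^2 = 2 and v^2 = 3, which is what keeps the spanning set finite.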

  3. If u and v are integral as above, then R[u,v] is an integral extension that contains u+v, u-v, and uv. These are all integral over R.
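A sketch verifying this for u = sqrt(2) and v = sqrt(3), assuming sympy: the sum, difference, and product each satisfy a monic integer polynomial.

```python
# Sketch: u = sqrt(2) and v = sqrt(3) are integral over Z, and so are
# u+v, u-v, and u*v -- each has a monic integer minimal polynomial.
from sympy import sqrt, Symbol, minimal_polynomial

x = Symbol('x')
u, v = sqrt(2), sqrt(3)
polys = [minimal_polynomial(w, x) for w in (u + v, u - v, u * v)]
for p in polys:
    print(p)
# x**4 - 10*x**2 + 1
# x**4 - 10*x**2 + 1
# x**2 - 6
```

Note that u+v and u-v share a minimal polynomial; they are conjugate roots of the same monic quartic.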

  4. Let W be the set of integral elements in S/R. We just showed W is closed under addition and multiplication, hence W forms a ring. This is the integral closure of R in S, the largest integral extension of R inside S.

  5. Let S be an integral extension of R, and let T be an integral extension of S. Let v be an element of T, and let p(x) be the monic polynomial that proves v is integral over S. Let a0 a1 a2 … an be the coefficients of p, taken from S. Each of these is integral over R. Adjoin them to R to find an integral extension that is a finitely generated R module, which we will call G. Since p is monic with coefficients in G, v is integral over G, and G[v] is a finitely generated G module, which we will call H. Cross the generators of H over G with the generators of G over R to show H is a finitely generated R module. Thus v is integral over R, and since v was arbitrary, T is an integral extension of R. The composition of integral extensions is integral.

  6. Let R be an integral domain, let F be the fraction field of R, let E be an algebraic extension of F, and let S be the integral closure of R inside E. Let B be a basis for E over F. Multiply each basis element by something in R, so that the basis elements are all integral, hence they all lie in S. (If b satisfies a polynomial of degree n over R with leading coefficient a, multiply the equation by a^(n-1) to produce a monic polynomial satisfied by ab.) Now the fraction field of S includes F, and B, hence the fraction field of S is all of E.

  7. Let E/F and S/R be as above. Find a basis for E/F in S and represent the elements of S as unique linear combinations of basis elements with coefficients in F. The tensor product S tensor F, taken over R, is generated by elements of S tensored with reciprocals of elements of R. Use the basis B, which lies in S, rather than all of S. If q is an arbitrary member of our basis B, consider (x/y)q tensored with 1/z. This is the same as xq tensored with 1/(yz). The tensor product is the free R module generated by B, with denominators from R. This gives the linear combinations of B with coefficients in F, which is E. Therefore S tensor F = E.

  8. Let the ring extension S be a finitely generated R module, hence an integral extension. Let S be an integral domain, whence R is also an integral domain. Let F be the fraction field of R and let E be the fraction field of S. Assume S contains a free R module of rank n, and anything in S, outside this free R module, is a linear combination of basis elements with coefficients in F. This reproduces the conditions found above, so S embeds in S tensor F, which lies in E.

    Let b1 through bn be a basis for the free R module inside S. Let x be a nonzero element of S, and build a matrix M that represents multiplication by x, relative to this basis. In other words, the first row of M contains the coefficients for b1 through bn that produce x*b1, and so on. Let d be the determinant of M. If d is 0 then there is a nonzero vector y such that yM = 0. You might need to dip into the fraction field F to build y, but you can always multiply through by a common denominator, so that y lies in R. View y as a nonzero element of S, and yx = 0, which contradicts S being an integral domain. Therefore d is nonzero.

    Let W be the inverse of M, using the common denominator d. The entries of W lie in F. Let each row of W represent a linear combination of our basis, using coefficients in F. This is an element of S tensor F, which lives in E. Let the ith row of W represent vi; in other words, vi is the sum over j of W(i,j)*bj. Now, multiplying the ith row of W by M effectively multiplies vi by x. The product W*M is the identity matrix, hence vi*x = bi. Write 1 as the sum over i of ci*bi, with coefficients ci in F, and the sum of ci*vi is the inverse of x. Therefore the fractions of S are all covered by the linear combinations of B with coefficients in F. More concisely, E is the tensor product of S and F as R modules. This is the same result we saw above. However, in this case S need not be integrally closed inside E.
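    A sketch of this inversion, assuming sympy: take S = Z[sqrt(2)] with basis (1, sqrt(2)), and x = 2 + sqrt(2). The determinant of the multiplication matrix is nonzero, and the first row of its inverse hands back 1/x as an F-linear combination of the basis.

```python
# Sketch: S = Z[sqrt(2)], basis B = (1, sqrt(2)); M represents
# multiplication by x = 2 + sqrt(2).  det(M) is nonzero, and row 1 of
# M**-1, applied to the basis, is an element v1 of E with v1*x = b1 = 1.
from sympy import sqrt, Matrix, expand

B = Matrix([1, sqrt(2)])
x = 2 + sqrt(2)
M = Matrix([[2, 1],    # x * 1       = 2*1 + 1*sqrt(2)
            [2, 2]])   # x * sqrt(2) = 2*1 + 2*sqrt(2)
d = M.det()
print(d)               # 2, nonzero since S is a domain
W = M.inv()            # entries lie in F = Q
v1 = (W.row(0) * B)[0, 0]   # row 1 of W, applied to the basis
print(expand(v1 * x))       # 1, so v1 = 1/x = 1 - sqrt(2)/2
```

    Since 1 = 1*b1 + 0*b2 here, the inverse of x is just v1; in general it is the matching linear combination of the vi.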