Sequences and Series, Absolute and Conditional Convergence

Absolute and Conditional Convergence

The series s of real numbers, with terms sj, is absolutely convergent if the sum of the absolute values |sj| converges.

A series that converges, but is not absolutely convergent, is conditionally convergent.

Let s be a conditionally convergent series of real numbers. Consider first the positive terms of s, and then the negative terms of s. If the partial sums of the positive terms were bounded, and the partial sums of the negative terms were bounded, then s would be absolutely convergent. So either the positive terms or the negative terms sum to infinity. If the positive terms summed to infinity while the negative terms added up to a finite value, s would not converge. Thus the positive terms add up to +∞ and the negative terms add up to -∞. We can use this to rearrange terms and make s add up to anything we like.
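As a concrete example, the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + … converges to ln 2, but not absolutely. The short Python sketch below is only an illustration; the choice of series and the cutoff of 100,000 terms are assumptions made here, not part of the argument.

def alternating_harmonic(n):
    # first n terms of 1 - 1/2 + 1/3 - 1/4 + ...
    return [(-1) ** (j + 1) / j for j in range(1, n + 1)]

terms = alternating_harmonic(100000)

print(sum(terms))                        # about 0.6931, near ln 2
print(sum(t for t in terms if t > 0))    # about 6.4, and still growing as n grows
print(sum(t for t in terms if t < 0))    # about -5.7, and still dropping as n grows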

Let w be an arbitrary limit. Build a new series r as follows. Bring in positive terms from s until the sum of r exceeds w. Then bring in negative terms until the sum drops below w. Then bring in positive terms until the sum exceeds w. Then bring in negative terms until the sum drops below w. Repeat this process, building a new series r that uses all the terms of s, but in a different order.

Given ε, find n such that all the terms of s beyond n have absolute value less than ε. Once the first n terms of s have been used up, and the partial sums of r next cross w, they never stray farther than ε from w. Each step either moves the sum toward w or crosses over it, and every step has size less than ε, so any overshoot past w is smaller than ε. The partial sums of r remain within ε of w from that point on, hence the new series r converges to w.
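Here is a sketch of that construction in Python, assuming the alternating harmonic series plays the role of s; the target w, the cutoff on the number of terms, and the helper rearrange_to are illustrative choices, not part of the proof.

def rearrange_to(w, n_terms=200000):
    # greedily interleave terms of 1 - 1/2 + 1/3 - ... so the partial sums straddle w
    positives = (1 / j for j in range(1, n_terms, 2))     # 1, 1/3, 1/5, ...
    negatives = (-1 / j for j in range(2, n_terms, 2))    # -1/2, -1/4, ...
    total, r = 0.0, []
    while True:
        try:
            term = next(positives) if total <= w else next(negatives)
        except StopIteration:
            break                      # ran out of terms at this cutoff
        total += term
        r.append(term)
    return r, total

r, total = rearrange_to(3.0)
print(total)    # close to 3.0; a different target gives a different limit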

When s is absolutely convergent, terms can be permuted with impunity. Extract only the positive terms of s; the resulting series is dominated by the series of absolute values of s, which converges. Hence the positive terms form a series that converges to some limit p. Similarly, let the negative terms converge to q. Note that p ≥ 0 and q ≤ 0.

If the terms are rearranged, the positive terms still sum to p monotonically, and the negative terms still sum to q monotonically. Given ε, there is some n beyond which the positive terms add up to more than p-ε, and the negative terms add up to less than q+ε. At this point, and all points beyond, the partial sum of the rearranged series is within ε of p+q. Thus the rearranged series approaches p+q, just like the original.
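A quick numeric check, assuming we take the absolutely convergent series whose terms are (-1)^(j+1)/j², with a random shuffle standing in for the rearrangement; both choices are illustrative.

import random

terms = [(-1) ** (j + 1) / j ** 2 for j in range(1, 50001)]   # absolutely convergent

shuffled = terms[:]          # same terms, different order
random.shuffle(shuffled)

print(sum(terms))            # about 0.8225, near pi^2 / 12
print(sum(shuffled))         # the same value: the order does not matter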

The above can be generalized to complex numbers, or to vectors in finite dimensions. By definition, such a series is absolutely convergent if all the component series are absolutely convergent. Again, terms can be rearranged with impunity: each component series is absolutely convergent, and produces the same limit regardless of order. Since the components have the same limits, s has the same limit.

Rearrangements Within Blocks

Some rearrangements are acceptable for any convergent series. For instance, you can swap the first two terms, then the next two, then the next two, and so on.

a1 + a0 + a3 + a2 + a5 + a4 + …

Let the original series sum to s. Given ε, find an n such that the partial sums of the original series beyond n are within ε/2 of s, and, at the same time, the terms of the series (which go to zero) are bounded by ε/2. Now consider a partial sum of our rearranged series, beyond n. If the partial sum contains an even number of terms, nothing has changed, and we're ok. If the number of terms is odd, set the last term aside. The remaining partial sum is within ε/2 of s, and the last term, which we can bring back in, is bounded by ε/2. Hence the partial sum is within ε of s. The rearranged series converges to s.
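The following sketch checks the claim numerically, again assuming the alternating harmonic series as the test case; swap_pairs is a helper written only for this illustration.

def swap_pairs(terms):
    # swap terms 0 and 1, terms 2 and 3, and so on
    out = terms[:]
    for i in range(0, len(out) - 1, 2):
        out[i], out[i + 1] = out[i + 1], out[i]
    return out

terms = [(-1) ** (j + 1) / j for j in range(1, 100001)]
print(sum(terms))              # about 0.6931, near ln 2
print(sum(swap_pairs(terms)))  # the same limit: each swap stays inside one block of 2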

We don't have to swap all the pairs in the above example. We can swap some pairs and not others. The same proof applies.

As a generalization, carve up the series into blocks of size k. We can permute the terms in each block, any way we like, and the sum is the same. Apply the above proof, using ε/k instead of ε/2.

Even stranger regroupings are possible. For instance, divide the series up into blocks of 3. If a block contains x, y, and z, we might replace it with this block.

0.2x + 0.8y + 0.6z, 0.5x + 0.1y + 0.1z, 0.3x + 0.1y + 0.3z

Note that the block sums to x+y+z, as it did before. Each new term consists of pieces of the old terms, and the coefficients on x, y, and z are always bounded by 1. If these restrictions hold for each block, i.e. each block keeps its original sum and every coefficient is bounded by 1, the sum of the series is unchanged. Select an ε and let γ = ε/k². Move out in the series, so that partial sums are within γ of s, and terms are bounded by γ. Consider a new partial sum, and back up to the previous complete block. This sum is within γ of s. Each term in the incomplete block is, at worst, kγ in absolute value, and there are at most k-1 such terms. Thus the sum is within k²γ of s, hence within ε of s, and that completes the proof.
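As a numeric sanity check, the sketch below applies the block-of-3 recipe above to the alternating harmonic series; the choice of series and the helper remix_blocks are assumptions made for the illustration.

COEFFS = [(0.2, 0.8, 0.6),
          (0.5, 0.1, 0.1),
          (0.3, 0.1, 0.3)]     # every entry bounded by 1; each column sums to 1

def remix_blocks(terms):
    # replace each block x, y, z with the three mixed terms shown above
    out = []
    for b in range(0, len(terms) - 2, 3):
        x, y, z = terms[b:b + 3]
        for cx, cy, cz in COEFFS:
            out.append(cx * x + cy * y + cz * z)
    return out

terms = [(-1) ** (j + 1) / j for j in range(1, 99001)]   # 99000 terms, blocks of 3
print(sum(terms))                # about 0.6931
print(sum(remix_blocks(terms)))  # the same: each block still sums to x + y + z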