Calculus, Functions and Taylor Series

Functions and Taylor Series

Most familiar functions are equal to their taylor series, at least on some neighborhood about 0. This means the error term, described in the previous section, approaches 0 as we consider more accurate, higher-degree taylor polynomials. As you recall, the error term at x is bounded by x^n/n! times the worst nth derivative on the interval [0,x]. If the nth derivative can be tamed, n! dominates x^n, and the error term goes to 0. When this happens, the taylor series equals the function that created it.
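
To see the factorial winning, fix x and watch x^n/n! as n grows. A few lines of Python make the point; the choice of x = 10 and the sample values of n are arbitrary.

    from math import factorial

    x = 10.0
    for n in (5, 10, 20, 30, 40):
        # x^n/n! rises at first, then the factorial takes over
        print(n, x**n / factorial(n))

The quotient climbs into the thousands before collapsing toward 0; by n = 40 it is already below 10^-7.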

If f is a polynomial of degree n, then all derivatives beyond the nth are 0. The taylor series is just the taylor polynomial of degree n, and that polynomial is f itself. This is not surprising; the taylor polynomial ought to equal the polynomial that produced it.
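
As a sanity check, one can rebuild a cubic from its derivatives at 0. A small Python sketch, with the cubic 2x^3 - 5x + 7 chosen arbitrarily and its derivatives computed by hand:

    from math import factorial

    # p(x) = 2x^3 - 5x + 7
    # p(0) = 7, p'(0) = -5, p''(0) = 0, p'''(0) = 12
    derivs = [7, -5, 0, 12]

    # the kth taylor coefficient is the kth derivative over k!
    coeffs = [d / factorial(k) for k, d in enumerate(derivs)]
    print(coeffs)  # [7.0, -5.0, 0.0, 2.0], the coefficients of p

The taylor coefficients are exactly the coefficients of p, constant term first.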

Next consider the exponential function, which increases faster than any polynomial. A finite taylor approximation will never suffice, but how about its taylor series? The derivatives at x are all exp(x), a fixed number that we will call b. In fact b is an upper bound for the derivatives between 0 and x. (If x is negative, use 1 as the upper bound.) Thus the error term is at most b×x^n/n!, which approaches 0 for large n. The exponential function is equal to its taylor series for all x. The series is:

exp(x) = 1 + x + x^2/2 + x^3/6 + x^4/24 + x^5/120 + … + x^n/n! + …
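
A direct translation into Python shows the partial sums homing in on exp(x); the cutoff of forty terms and the sample inputs below are arbitrary.

    from math import exp, factorial

    def exp_series(x, terms):
        # partial sum of the taylor series for exp
        return sum(x**n / factorial(n) for n in range(terms))

    for x in (1.0, 5.0, -3.0):
        print(x, exp_series(x, 40), exp(x))

Larger x needs more terms before the factorial takes over, but the series always gets there.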

The sine and cosine functions have all their derivatives bounded by 1, and x^n/n! approaches 0 for large n, so sin(x) and cos(x) equal their taylor series everywhere. Here are those series.

sin(x) = x - x^3/6 + x^5/120 - x^7/5040 + …

cos(x) = 1 - x^2/2 + x^4/24 - x^6/720 + …
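
The same experiment works for sine and cosine; the helper names and the test point x = 2 in this sketch are arbitrary.

    from math import sin, cos, factorial

    def sin_series(x, terms):
        # odd powers, alternating signs
        return sum((-1)**k * x**(2*k+1) / factorial(2*k+1) for k in range(terms))

    def cos_series(x, terms):
        # even powers, alternating signs
        return sum((-1)**k * x**(2*k) / factorial(2*k) for k in range(terms))

    x = 2.0
    print(sin_series(x, 20), sin(x))  # both about 0.909297
    print(cos_series(x, 20), cos(x))  # both about -0.416147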

Since the log function blows up at 0, take its taylor series about 1. The nth derivative of log is ±(n-1)!/x^n. (Show this by induction on n.) If we keep x > 1, the derivatives never exceed (n-1)! in absolute value. Divide by n!, giving an error term of (x-1)^n/n. This goes to 0 for x ≤ 2. The taylor series agrees with log(x) out to x = 2, and blows up beyond that. Here is the series, given in terms of r, where r = x-1.

log(1+r) = r - r^2/2 + r^3/3 - r^4/4 + … ± r^n/n + …
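
Partial sums tell the convergence story. In the Python sketch below, r = 0.5 (x = 1.5) settles on log(1.5), while r = 1.5 (x = 2.5) lies outside the window and the sums run away.

    from math import log

    def log_series(r, terms):
        # r - r^2/2 + r^3/3 - ...
        return sum((-1)**(n+1) * r**n / n for n in range(1, terms + 1))

    print(log_series(0.5, 50), log(1.5))  # agree to many digits
    print(log_series(1.5, 50))            # huge; the series diverges here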

Pathological Example

There are pathological functions that don't equal their taylor series, even at points arbitrarily close to the point of expansion. Consider the function f = exp(-1/x^2), and let f(0) = 0. This function is extremely flat near the origin, so flat that all its derivatives at the origin are 0 (see below). Its taylor series is identically 0, yet the function is obviously nonzero everywhere other than the origin.

Assume the nth derivative of f at a point x, other than 0, is r(x)×exp(-1/x^2), where r(x) is a rational expression, the quotient of two polynomials. Verify this by induction on n, using the product and chain rules to differentiate at each step. What happens to the nth derivative as we approach 0?
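
A computer algebra system can carry out the first few steps of the induction. A sympy sketch, assuming sympy is available:

    from sympy import symbols, exp, diff, simplify

    x = symbols('x')
    f = exp(-1/x**2)

    for n in range(1, 4):
        # each derivative is a rational function times exp(-1/x^2)
        print(n, simplify(diff(f, x, n)))

The first derivative comes out as 2×exp(-1/x^2)/x^3, and each further derivative keeps the same shape: a quotient of polynomials times exp(-1/x^2).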

Replace x with 1/x and take the limit as x approaches infinity. Now we have p(x)×exp(-x^2) over q(x), for polynomials p and q. Move the exponential downstairs, writing this as p(x) over q(x)×exp(x^2). Apply L'Hôpital's rule repeatedly; each application lowers the degree of the numerator, while the denominator keeps a factor of exp(x^2). Eventually the numerator is a constant, and the denominator is still a polynomial times exp(x^2), which blows up. The quotient approaches 0, so the nth derivative approaches 0 at the origin.
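
The heart of that argument is that exp(x^2) outgrows every polynomial. Numerically, with an arbitrary tenth power:

    from math import exp

    for x in (2.0, 5.0, 10.0):
        # any fixed power of x is eventually crushed by exp(x^2)
        print(x, x**10 / exp(x**2))

Already at x = 10 the quotient is around 10^-34.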

To find the derivatives at 0 itself, consider the difference quotient r(h)×exp(-1/h^2) over h as h approaches 0; this is the quotient that defines the next derivative at the origin, since the previous derivative is 0 there. Again, r(h) is a rational function, the quotient of two polynomials. One application of L'Hôpital's rule drops the denominator to 1, leaving another rational function times exp(-1/h^2) in the numerator. We already showed this goes to 0. Thus the next derivative exists at the origin, and is 0. By induction, all derivatives are 0 at the origin, and f is infinitely differentiable over the entire x axis. Yet f does not equal its taylor series anywhere other than the origin, the point of expansion.
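
A numeric coda: the first difference quotients at 0 vanish rapidly, while f itself does not. The sample step sizes here are arbitrary.

    from math import exp

    def f(x):
        return 0.0 if x == 0 else exp(-1.0 / x**2)

    for h in (0.5, 0.2, 0.1):
        # difference quotient for f'(0); it dies off much faster than h
        print(h, f(h) / h)

    print(f(0.5))  # about 0.018, so f is certainly not the zero function

By h = 0.1 the quotient is already below 10^-42, foreshadowing the fact that every derivative at the origin is 0.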