One of the most fun things to do in math is to add things up, lots and lots of them, the more the better! It's clear, of course, that if you add up a finite number of things—numbers, vectors, polynomials, matrices, whatever—you'll get a finite answer. Things get more interesting when you add up an infinite number of things. But as I've blogged about at length earlier, there are many different meanings of the word “infinite”. So for now, let's just limit ourselves to talking about a countably infinite list of things.
First, a few terms. A sequence is nothing more than a list, usually indexed so you can refer to every element unambiguously. It's usually written \( \{ a_0, a_1, a_2, \dots, a_n, \dots \} \). A(n infinite) series is what you get when you add up all the terms of a sequence. This will look like \( a_0 + a_1 + \dots + a_n + \dots \), and is sometimes more compactly written in “sigma-notation” as \( \sum_{i=0}^{\infty} a_i \) (read as “sum from i equals 0 to infinity of a sub i ” or something similar). A series is said to diverge if its value is infinite, and to converge if it adds up to a finite number.
It's immediately clear, of course, that a series like \( 1 + 1 + 1 + \dots \) is going to “diverge” to infinity. (The proof is trivial.) Furthermore, what this means is that even if you take the tiniest number you can imagine, whether that is one-half or one-quadrillionth or one-googolplexth, if you add it to itself an infinite number of times, then you're going to get infinity. (The proof is trivial: any series of the form \( x + x + x + \dots \) is the same as \( x \cdot (1 + 1 + \dots ) \), and we already know that that series diverges.)
So what this means is that the terms of our list have to get successively smaller and smaller if we are to get a series that converges. So what about the series \( 1 + \frac{1}{2} + \frac{1}{3} + \dots + \frac{1}{n} + \dots \)? This famous series is called the harmonic series. Its terms clearly shrink toward zero. So what does it converge to?
As it turns out, the harmonic series does not converge at all! There are a number of proofs for this, but the simplest one I remember is this:
Take the series \( 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \dots \), which will clearly diverge. Now rewrite it as \( 1 + \frac{1}{2} + \left( \frac{1}{4} + \frac{1}{4} \right) + \left( \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} \right) + \dots \)
Comparing this term-by-term and bracket-by-bracket to the harmonic series, we find that
\[ \begin{array}{rcl} 1 & = & 1 \\ \frac{1}{2} & = & \frac{1}{2} \\ \frac{1}{3} + \frac{1}{4} & > & \frac{1}{4} + \frac{1}{4} \\ \frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8} & > & \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} \end{array} \]
and so on. But if the series on the right-hand side already diverges, then the harmonic series, which is clearly bigger from its third term onwards, has to diverge too!
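If you'd like to watch the harmonic series stubbornly refuse to settle down, here's a quick numerical sketch (a Python toy of my own, not part of the proof): the partial sums track \( \ln n \), so they grow without bound, just very, very slowly.

```python
import math

def harmonic(n):
    """Partial sum 1 + 1/2 + ... + 1/n of the harmonic series."""
    return sum(1.0 / k for k in range(1, n + 1))

# The partial sums never level off: they shadow ln(n) (offset by the
# Euler-Mascheroni constant, about 0.577), so they creep past any bound.
for n in (10, 1000, 1_000_000):
    print(n, harmonic(n), math.log(n))
```

Even after a million terms the sum has barely crawled past 14, but crawl on it does, forever.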
I'm not going to get into the various tests that we can perform to check if our favorite series converges or diverges. However, just in case you think at this point that it can never be possible to add up an infinite number of things to get a finite number (à la Zeno), consider the series \( \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \dots \), or, in our mathematical notation, \( \sum_{i=1}^{\infty} 2^{-i} \).
Each term is half the size of the previous term; this sort of series is called a geometric series. Without getting into all the mathematical details of proving how some geometric series can converge, it can still be instructive to compare the ratio of two successive terms of our series with two successive terms of the harmonic series.
The ratio of two successive terms \( \frac{a_{n+1}}{a_n} \) for the geometric series is the constant \( \frac{1}{2} \), as is clear to us from the way we constructed the series in the first place. For the harmonic series, this works out to \( \frac{\frac{1}{n+1}}{\frac{1}{n}} \), or, less clumsily, \( \frac{n}{n+1} \).
As \( n \) gets larger and larger, the terms of the geometric series get smaller and smaller at the same rate. But in the case of the harmonic series, the ratio gets closer and closer to \( 1 \) as \( n \) gets larger and larger; in other words, even though the terms of the harmonic series are getting smaller, they're nevertheless not getting smaller “fast enough”, whatever that means.
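Here's a little Python sketch (mine, purely illustrative) that puts those two ratios side by side: the geometric one is pinned at \( \frac{1}{2} \), while the harmonic one creeps up toward 1.

```python
def geometric_ratio(n):
    """Ratio a_{n+1}/a_n for the series with a_n = (1/2)**n."""
    return (0.5 ** (n + 1)) / (0.5 ** n)

def harmonic_ratio(n):
    """Ratio a_{n+1}/a_n for the harmonic series, i.e. n/(n+1)."""
    return (1.0 / (n + 1)) / (1.0 / n)

# The geometric ratio never budges from 0.5; the harmonic one
# (1/2, 10/11, 100/101, ...) sneaks ever closer to 1.
for n in (1, 10, 100):
    print(n, geometric_ratio(n), harmonic_ratio(n))
```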
Anyway, if we take as a given that our geometric series converges, how do we find out what this value in fact is? Well, we could simply add up the terms one by one and see what we get, but this can often be a slow process, and furthermore, this process can only give us an approximation to the final answer. Is there some nifty way to find out the exact value to which our geometric series converges?
As it turns out, there is a fun way to do this. Since we “know” that our series converges to a finite value, let's call this value \( S \), for “sum”. That is, \( S = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \dots \)
So what is \( \frac{1}{2} S\) then? \( \frac{1}{2} S = \frac{1}{4} + \frac{1}{8} + \dots \)
But wait! This is almost the same series as our initial series! Let's subtract: \[ \begin{array}{rcl} S - \frac{1}{2} S & = & \left( \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \dots \right) - \left( \frac{1}{4} + \frac{1}{8} + \dots \right) \\ \frac{1}{2} S & = & \frac{1}{2} \\ S & = & 1 \end{array} \]
In other words, \( \frac{1}{2} + \frac{1}{4} + \dots = 1 \) . Cool!
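You can watch the partial sums close in on that answer with a few lines of Python (again, just an illustration of my own): after \( n \) terms the partial sum is exactly \( 1 - 2^{-n} \), so it homes in on \( S = 1 \) from below.

```python
# Add up 1/2 + 1/4 + 1/8 + ... term by term; each power of two is exact
# in binary floating point, so after 50 terms s is exactly 1 - 2**-50.
s = 0.0
for i in range(1, 51):
    s += 2.0 ** -i
print(s)   # within 2**-50 of 1, but never quite reaching it
```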
What's more, we can tell that this easily generalizes to any geometric series where the ratio of successive terms is less than one in absolute value (obviously, since if the ratio is 1 or bigger in size, the terms never shrink and the series has no hope of converging). So if we write this ratio in general as \( x \), we can write our geometric series as \( a_0 + a_0 x + a_0 x^2 + a_0 x^3 + \dots \), or as \( \sum_{i=0}^{\infty} a_0 x^i \). Now, it's clear that \( a_0 \) is common to all the terms and can simply be extracted from the entire sum, so that really the only geometric series we need is \( \sum_{i=0}^{\infty} x^i = 1 + x + x^2 + \dots \).
So how do we evaluate this? We can do the same thing as we did earlier:
\[ \begin{array}{rcl} S & = & 1 + x + x^2 + \dots \\ x S & = & x + x^2 + x^3 + \dots \\ S - x S & = & 1 \\ (1-x) S & = & 1 \\ S & = & \frac{1}{1-x} \end{array} \]
Beautiful!
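And if you don't quite trust the algebra, a Python sanity check (my own sketch, not part of the derivation) shows the partial sums landing on \( \frac{1}{1-x} \) for a few ratios with \( |x| < 1 \):

```python
def geom_partial(x, n):
    """Partial sum 1 + x + x**2 + ... + x**(n-1)."""
    return sum(x ** i for i in range(n))

# For each ratio, 200 terms is plenty: the leftover tail x**200 / (1 - x)
# is vanishingly small, so the two columns agree.
for x in (0.5, -0.3, 0.9):
    print(x, geom_partial(x, 200), 1.0 / (1.0 - x))
```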
Let's take the series \( 1 + x + x^2 + x^3 + \cdots \) again. So far we've just been interested in the \( x \)-side of things. But there is another way to look at it: as the infinite sequence \( (1, 1, 1, \dots) \); in other words, we can shift our focus to the coefficients of the series. When we move away from the \( x \) to the coefficients, we must temporarily abandon our earlier restrictions on \( x \) being less than 1 and all that; we are considering this as a formal power series, where the \( x \)s are simply “a clothesline on which we hang up a sequence of numbers for display”, in the words of Herbert Wilf in his remarkable book generatingfunctionology. This immediately tells us that the geometric series is nothing more than one particular kind of formal power series (if this wasn't clear to us earlier), and suggests that there may be many more formal power series out there that we can try to sum up! From the perspective of formal power series, writing
\[ 1 + x + x^2 + x^3 + \dots = \frac{1}{1-x} \]
means that the series \( 1 + x + x^2 + x^3 + \dots \) and the polynomial \( 1 - x \) are formally inverses of each other. If we then decide to take the \( x \)-side of things into account (i.e., if we consider \( 1 + x + x^2 + x^3 + \dots \) no longer as a formal power series but as an actual series where the value of \( x \) matters), then for all values of \( x \) where the series converges, we can use its inverse to quickly compute its value.
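We can even watch this formal inverse at work with a few lines of Python (a toy of my own, multiplying coefficient lists term by term): truncate \( 1 + x + x^2 + \dots \), multiply by \( 1 - x \), and everything telescopes away except the leading 1 and one stray term at the truncation point.

```python
def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists, lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# Eight terms of 1 + x + x^2 + ... times (1 - x): all the middle terms
# cancel, leaving the constant 1 and a leftover -x^8 from the truncation.
print(poly_mul([1] * 8, [1, -1]))   # [1, 0, 0, 0, 0, 0, 0, 0, -1]
```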
Take another formal power series, this time based on the sequence \(1, 2, 3, 4, \dots \). We would write this formal power series as \( 1 + 2x + 3x^2 + 4x^3 + \dots \) . How do we add this up? Here, things get quite interesting, since there are at least two ways to do this:
Method the First:
We have
\[ \begin{array}{rcl} 1 + 2x + 3x^2 + 4x^3 + \dots & = & 1 + (x + x) + (x^2 + x^2 + x^2) + (x^3 + x^3 + x^3 + x^3) + \dots \\ & = & (1 + x + x^2 + \dots ) + x (1 + x + x^2 + \dots ) + x^2 (1 + x + x^2 + \dots ) + \dots \\ & = & (1 + x + x^2 + \dots )^2 \\ & = & \left( \frac{1}{1-x} \right)^2 \end{array} \]
Method the Second:
If we know some calculus, we know that the first derivative of \( x^n \) is \( \frac{d}{dx}x^n = nx^{n-1} \). Now, it should be clear from this that our formal power series can be written as a sum of derivatives.
\[ \begin{array}{rcl} 1 + 2x + 3x^2 + 4x^3 + \dots & = & \frac{d}{dx} x + \frac{d}{dx} x^2 + \frac{d}{dx} x^3 + \frac{d}{dx} x^4 + \dots \\ & = & \frac{d}{dx} \left( x + x^2 + x^3 + x^4 + \dots \right) \\ & = & \frac{d}{dx} \frac{x}{1-x} \\ & = & \frac{1}{1-x} + \frac{x}{(1-x)^2} \\ & = & \frac{(1-x) + x}{(1-x)^2} \\ & = & \frac{1}{(1-x)^2} \end{array} \]
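Both methods say the answer is \( \frac{1}{(1-x)^2} \), and a quick numerical check in Python (my own sketch, not a proof) agrees:

```python
def counting_partial(x, n):
    """Partial sum 1 + 2x + 3x**2 + ... + n * x**(n-1)."""
    return sum((i + 1) * x ** i for i in range(n))

# For |x| < 1 the partial sums land on 1/(1-x)**2, just as both
# derivations predict; 300 terms leaves only a negligible tail.
for x in (0.5, -0.4):
    print(x, counting_partial(x, 300), 1.0 / (1.0 - x) ** 2)
```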