
Welcome to Calculus.

I'm Professor Ghrist.


We're about to begin lecture 53 on
absolute and conditional convergence.
Infinite series either converge or
diverge.
Right?
Well, the real situation is a bit more complicated than that.
In this lesson, we'll focus attention on
series that have both positive and
negative terms.
And we'll refine our notion of convergence
into absolute and conditional.
When it comes to determining convergence
or divergence, we have
at our disposal a collection of tests.
Sometimes those tests work well.
Sometimes, well, it's not so obvious how
to proceed.
Consider the following
three series, all of which look very
complicated.
Following the rule of doing the ratio test first is maybe not such a good idea in this particular case. But I claim that these series are easy to figure out: the first two converge, and the last diverges.
Why is it so easy?
It is easy because of this term in front, the negative 1 to the n, a term which, ironically, renders most of our prior tests useless.
However, these are
examples of alternating series, series
whose terms
alternate between positive and negative.
Such series are
very easy to work with.
One often writes them in the form sum over n of negative 1 to the n, times a sub n, where the a sub n terms are positive.
Let's do one last test,
the alternating series test.
It begins as follows.
If we have a decreasing positive sequence, a sub n, then we're going to look at the limit of these terms, just as in the nth term test. The alternating series, the sum over n of negative 1 to the n, a sub n, converges if and only if the limit is 0, and it diverges if and only if the limit is non-zero.
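In symbols, for a positive decreasing sequence a sub n, the statement reads:

\[
\sum_{n} (-1)^{n} a_{n} \ \text{converges} \iff \lim_{n \to \infty} a_{n} = 0 .
\]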

This is
in contradistinction to the nth term test,
which was not if and only if.
Now, the bad part of this test is that it's not very widely applicable.
You must have an
alternating series with decreasing terms.
However, this
test is as easy as can be.
And because of the
ubiquity of alternating series, it's
really quite useful.
How can it be that we get such a strong
result for
this test?
Well, let us consider the convergence
of an alternating series.
Of course, we will look at the sequence of
partial sums.
That is, the sum as n goes from 0 to some T of negative 1 to the n, a sub n.
What does the convergence of this sequence
look like?
Well, if it is an alternating series, then that means we start at a sub 0, we jump backwards by the amount a sub 1, then we jump forwards by the amount a sub 2, backwards by the amount a sub 3, et cetera.
If, as per our hypotheses, the sequence of
a sub
n is decreasing, that means our jumps get
smaller and smaller.
This means in the limit, there are only
two possibilities.
If the limit of the a sub n is zero, then
our
jump size decreases to zero, and our
series converges.
If the limit of the a sub n is non-zero,
then we oscillate back and forth by that
limiting
jump amount.
As per the nth term test, the series does
not converge.
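As a quick numerical illustration of this picture, here is a minimal Python sketch; the particular choice a sub n equals 1 over the square root of n plus 1 is just one example of a decreasing positive sequence with limit zero.

```python
import math

# Partial sums of sum_{n >= 0} (-1)^n a_n for a decreasing positive sequence a_n -> 0.
# The choice a_n = 1/sqrt(n + 1) is only for illustration; any such sequence behaves this way.
def a(n):
    return 1.0 / math.sqrt(n + 1)

partial = 0.0
for n in range(8):
    partial += (-1) ** n * a(n)
    print(f"n = {n}:  jump size = {a(n):.4f},  partial sum = {partial:.4f}")
# The jump sizes a_n shrink to zero, so the back-and-forth oscillation dies out
# and the partial sums converge (slowly, in this example, to roughly 0.605).
```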
Well, let's see how this test works in the
context
of the three series with which we began
our lecture.
All three of these are alternating series.
And hence, the alternating series test
applies.
The first is the sum of negative 1 to the n, times pi to the n, times log n squared plus 1, over n factorial times hyperbolic cosine of n to the 5 thirds.
Now, looking at this, it's pretty obvious
that the n-factorial term dominates all
others.

Hence, the limit of the a sub n is zero, and this series converges.
No other work necessary.
Likewise, the second series is the sum of negative 1 to the n, times 1 over log of log cubed of log to the fifth of n. This one also has terms that are obviously going to zero.
Hence, it converges without further work.
The last example involves the cube root of n to the fifth minus 3n cubed plus 2, over the square root of n cubed plus 9n squared plus 1. Well, that's not half so complicated as it looks. We could've easily computed the leading order terms of both the numerator and the denominator and discerned, through the nth term test, that this diverges. This is true whether or not it's an alternating series.
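As a sanity check, a quick Python computation along these lines shows the terms of the first series dropping to zero and the terms of the third series growing like n to the one sixth; the exact reading of the first series' terms is taken as written here, and the second series is omitted only to keep the sketch short.

```python
import math

# |a_n| for the first series, reading "log n squared plus 1" as log(n^2 + 1):
# pi^n * log(n^2 + 1) / (n! * cosh(n^(5/3))).
def a_first(n):
    return (math.pi ** n * math.log(n ** 2 + 1)) / (math.factorial(n) * math.cosh(n ** (5 / 3)))

# |a_n| for the third series: (n^5 - 3n^3 + 2)^(1/3) / (n^3 + 9n^2 + 1)^(1/2).
def a_third(n):
    return (n ** 5 - 3 * n ** 3 + 2) ** (1 / 3) / math.sqrt(n ** 3 + 9 * n ** 2 + 1)

for n in (5, 10, 20, 40):
    print("first: ", n, a_first(n))    # collapses extremely fast: n! and cosh dominate
for n in (10, 1000, 10 ** 6):
    print("third: ", n, a_third(n))    # grows like n^(1/6): the terms do not go to zero
```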
In all these cases, it's pretty simple.
The contemplation of alternating series
leads us to new definitions involving
general series.
We say, that a series is absolutely
convergent
if not only does the sum of the a sub n
converge, but the sum of
the absolute values of the a sub n terms
also
converges.
We say that a series is
conditionally convergent if it is a
convergent series.
That is, the sum of the a sub n's
converges.
But the sum of the absolute values of the
a sub n's diverges.
Of course, a series that only has positive
terms in it can't be
conditionally convergent.
A conditionally convergent series is one
that has enough negative terms or maybe
alternating terms,
so as to force a weaker sort of
convergence.
Both of these are in distinction
to a divergent series which does not
converge
at all.
With these definitions, there are exactly three possibilities for a series. If you're given the sequence, then the series, the sum of the a sub n's, might diverge. If it converges, one checks the convergence of the sum of the absolute values. If that does not converge, then you have a conditionally convergent series.


Otherwise, you have an absolutely
convergent series.
These three are mutually exclusive.
Every series
is one of these three types, and only one.
Let's see some examples.
If we take the sum from 1 to infinity of negative 1 to the n plus 1, times 1 over n squared, this is 1 minus a fourth plus a ninth minus a sixteenth, et cetera. What can we say about this?
Well, it definitely
converges based on the alternating series
test.
When we take the absolute values, what do
we get?
We get a p series with
p equals 2.
That also converges.
Therefore, this is an example of an
absolutely convergent series.
On the other hand, if we take what we
might call the alternating harmonic
series, negative 1 to the n plus 1 over n.
Then by the alternating series test,
this does converge.
However, when we take the absolute
values of the terms, we get the harmonic
series which diverges.
Therefore, this is the canonical example
of a
conditionally convergent series.
If we take
the sum of negative 1 to the n, well,
we've already seen that that
is a divergent series.
And taking away the negative signs is not
going to change that.
Not at all.
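To see the difference between the first two examples numerically, one can compare the partial sums of the absolute values; the cutoffs below are chosen only for illustration.

```python
import math

# Partial sums of the ABSOLUTE values of the terms in the first two examples.
for N in (10, 1000, 100000):
    sum_p2 = sum(1 / n ** 2 for n in range(1, N + 1))   # from the series with terms 1/n^2
    sum_p1 = sum(1 / n for n in range(1, N + 1))         # from the series with terms 1/n
    print(f"N = {N}:  sum of 1/n^2 = {sum_p2:.5f},  sum of 1/n = {sum_p1:.2f}")
# The first column approaches pi^2/6, about 1.64493: absolute convergence.
# The second grows without bound, roughly like log N: only conditional convergence.
```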
In general, if we look at an alternating p-series, where we take 1 over n to the p and have the signs alternate on it, then instead of having convergence for p bigger than 1 and divergence for p less than or equal to 1, in this alternating scenario we have absolute convergence for p strictly bigger than one, and conditional convergence for positive p less than or equal to one.
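For instance, with p equal to one half, a value chosen here purely for illustration, a short Python computation shows the alternating sums settling down while the sums of absolute values grow without bound.

```python
import math

# Alternating p-series with p = 1/2:  sum over n >= 1 of (-1)^(n+1) / sqrt(n).
for N in (100, 10000, 1000000):
    alternating = sum((-1) ** (n + 1) / math.sqrt(n) for n in range(1, N + 1))
    absolute = sum(1 / math.sqrt(n) for n in range(1, N + 1))
    print(f"N = {N}:  alternating sum = {alternating:.4f},  sum of absolute values = {absolute:.1f}")
# The alternating sums hover near a limit (about 0.6049): conditional convergence.
# The absolute sums grow like 2*sqrt(N): the p = 1/2 series of absolute values diverges.
```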
For the remainder of this course, we're going to go back and reconsider all of the convergences that we have seen or implied, and revisit them in light of our new definitions.
Recall, from the very beginning of this class, how we showed that log of 1 plus x is the alternating sum of x to the n over n.
Now, you remember and I
remember that this only works when x
is strictly less than 1 in absolute value
just like
the geometric series.
However, I claim that we could bend the
rules just a little bit.
What happens when we substitute x equals 1 into this formula? We seem to get that log of 2 is 1 minus a half plus a third minus a fourth plus a fifth, et cetera.
Is this true?
In the past, I've asked you to trust me.
But now, you don't need me because
you have enough tools at your disposal.
The series on the
right does converge by the alternating
series test.
However, since it is the alternating harmonic series, its convergence is conditional. It limits to the value on the left, log of 2.
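You can also check this numerically; a quick Python computation of the partial sums creeps toward log of 2, slowly.

```python
import math

# Partial sums of 1 - 1/2 + 1/3 - 1/4 + ... , compared with log 2.
for N in (10, 1000, 100000):
    s = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
    print(f"N = {N}:  partial sum = {s:.6f},  log 2 = {math.log(2):.6f}")
# For an alternating series with decreasing terms, the error after N terms is at most
# the next term, 1/(N + 1), so the convergence is genuine but quite slow.
```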
Now recall that this series is a little bit tricky. We've argued in the past that by rearranging the terms and combining them in the appropriate manner, we seem to get something that is a different value, namely 3 halves log 2, simply by rearranging the terms in the original series. That was presented as a mystery; here, it is presented as a warning.
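One standard rearrangement that produces this value takes two positive terms, then one negative term, and repeats; a short Python sketch of that scheme shows the partial sums heading toward 3 halves log 2 instead of log 2.

```python
import math

# Rearrange 1 - 1/2 + 1/3 - 1/4 + ... as 1 + 1/3 - 1/2 + 1/5 + 1/7 - 1/4 + ...
def rearranged_partial_sum(blocks):
    s = 0.0
    pos, neg = 1, 2                # next odd denominator, next even denominator
    for _ in range(blocks):        # each block: + 1/pos + 1/(pos + 2) - 1/neg
        s += 1 / pos + 1 / (pos + 2) - 1 / neg
        pos += 4
        neg += 2
    return s

for blocks in (10, 1000, 100000):
    print(blocks, rearranged_partial_sum(blocks))
print("3/2 log 2 =", 1.5 * math.log(2))   # about 1.0397, not log 2, which is about 0.6931
```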
The real reason for that strange behavior
comes from the conditional
convergence of the series.
Whenever you have a conditionally
convergent series, you should be cautious.
It is a dangerous situation.
You are so close to having a divergent
series, you've got to be careful.
On the other hand, when you have
an absolutely convergent series, you're in
great shape.
It is a theorem that you can rearrange the terms in an absolutely convergent series at will, and the sum remains the same.
Absolutely convergent series are very robust, and will be very helpful when you're working with Taylor series.
On the other hand, this is not the case
for conditionally convergent series.
It is a result whose proof will not fit in this margin that, given a conditionally convergent series, you can rearrange the terms to sum up to any number you wish.
They're a bit dangerous.
Beware of them, but trust in absolute
convergence.
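To make that concrete, here is a small Python sketch of the standard greedy construction behind the result, applied to the alternating harmonic series with a target of 2, a value chosen here purely for illustration: add positive terms until you pass the target, then negative terms until you drop below it, and repeat.

```python
# Greedy rearrangement of the conditionally convergent series 1 - 1/2 + 1/3 - 1/4 + ...
target = 2.0
s = 0.0
pos, neg = 1, 2          # next unused positive term is 1/pos, next unused negative term is -1/neg

for _ in range(100000):
    if s <= target:      # below the target: spend the next positive term 1, 1/3, 1/5, ...
        s += 1 / pos
        pos += 2
    else:                # above the target: spend the next negative term -1/2, -1/4, ...
        s -= 1 / neg
        neg += 2

print(s)   # lands within the size of the last used term of 2.0:
           # the same terms, in a new order, with a new sum
```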
The distinction between absolute and
conditional
convergence may seem a little academic.
After all, what does it matter to applications in the sciences? Ooh, it makes a big difference, as we'll see in our next two lessons, which will take us back to the very beginnings of this course.
