Lecture from: 28.02.2024 | Video: Video ETHZ

Divergence and Limit Theorems

This part builds upon our understanding of convergent sequences, introducing the concept of divergence and establishing powerful theorems for manipulating limits.

Divergence: When Sequences Don’t Approach a Limit

Definition: The Absence of Convergence

Recall the definition of convergence: a sequence $(a_n)_{n \geq 1}$ converges to a limit $L \in \mathbb{R}$ if for all $\varepsilon > 0$, there exists $N \in \mathbb{N}$ such that for all $n \geq N$, $|a_n - L| < \varepsilon$.

A sequence $(a_n)$ is divergent if it does not converge to any real number $L$. In other words, there is no single point that the terms of the sequence eventually get arbitrarily close to.

Divergence doesn’t necessarily mean the terms grow infinitely large. A sequence can diverge by oscillating or exhibiting other erratic behavior. For example, the sequence $a_n = (-1)^n$ diverges, as it alternates between $-1$ and $1$.

Convergence to Infinity: A Special Kind of Divergence

We can extend the notion of convergence to include infinite limits:

  • A sequence $(a_n)$ converges to infinity ($\lim_{n \to \infty} a_n = +\infty$) if for all $T > 0$, there exists $N \in \mathbb{N}$ such that for all $n \geq N$, $a_n > T$. This signifies that the terms of the sequence eventually exceed any given real number.

  • Similarly, a sequence converges to negative infinity ($\lim_{n \to \infty} a_n = -\infty$) if for all $T < 0$, there exists $N \in \mathbb{N}$ such that for all $n \geq N$, $a_n < T$. The sequence eventually becomes smaller than any chosen threshold.

Example: Exponential Growth Dominates Linear Growth

Consider the sequence $a_n = 2^n$ for $n \geq 1$. We will show that $\lim_{n \to \infty} 2^n = +\infty$.

First, we prove a helpful inequality by induction: $2^n \geq n + 1$ for all $n \geq 1$.

  • Base Case (n=1): $2^1 = 2 \geq 1 + 1 = 2$. The base case holds.
  • Inductive Step: Assume $2^k \geq k + 1$ for some $k \geq 1$. Then $2^{k+1} = 2 \cdot 2^k \geq 2(k+1) = (k+2) + k \geq k + 2$. Thus, the inequality holds for $n = k+1$. By induction, $2^n \geq n + 1$ for all $n \geq 1$.

Now, let $T > 0$ be given. By the Archimedean Principle, there exists $N \in \mathbb{N}$ such that $N > T$. For all $n \geq N$, we have:

$$a_n = 2^n \geq n + 1 > n \geq N > T$$

Therefore, for all $n \geq N$, $a_n > T$. This proves that $\lim_{n \to \infty} 2^n = +\infty$.
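A quick numerical sanity check of this argument, assuming (my reading of the example) that the sequence is $a_n = 2^n$ and the induction claim is $2^n \geq n + 1$:

```python
# Sanity check: the induction claim 2**n >= n + 1, and that choosing
# N > T (Archimedean principle) forces 2**n > T for all n >= N.
# The sequence a_n = 2**n is my reading of the lecture's example.

def threshold_index(T: float) -> int:
    """A valid N from the Archimedean principle: the smallest integer > T."""
    return int(T) + 1

assert all(2**n >= n + 1 for n in range(1, 60))

for T in (10.0, 1000.0, 1e6):
    N = threshold_index(T)
    assert all(2**n > T for n in range(N, N + 20))
```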

Limit Theorems: Simplifying Limit Calculations

The following theorems, often called Limit Laws, are invaluable tools that allow us to calculate limits of combined sequences if we know the limits of the individual sequences.

Theorem: Algebraic Limit Laws

Let $(a_n)_{n \geq 1}$ and $(b_n)_{n \geq 1}$ be convergent sequences with $\lim_{n \to \infty} a_n = A$ and $\lim_{n \to \infty} b_n = B$. Then:

Sum Law

$$\lim_{n \to \infty} (a_n + b_n) = A + B$$

Proof of Sum Law

Let $\varepsilon > 0$. Since $(a_n)$ converges to $A$, there exists $N_1$ such that for all $n \geq N_1$, $|a_n - A| < \frac{\varepsilon}{2}$. Similarly, since $(b_n)$ converges to $B$, there exists $N_2$ such that for all $n \geq N_2$, $|b_n - B| < \frac{\varepsilon}{2}$.

Let $N = \max(N_1, N_2)$. For all $n \geq N$ (and hence $n \geq N_1$ and $n \geq N_2$), we have:

$$|(a_n + b_n) - (A + B)| \leq |a_n - A| + |b_n - B| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$$

Thus, by definition, $\lim_{n \to \infty} (a_n + b_n) = A + B$.
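To see the $\varepsilon/2$ bookkeeping in action, here is a small numerical illustration with sample sequences of my own choosing ($a_n = 1 + \frac{1}{n} \to 1$ and $b_n = 2 + \frac{1}{n^2} \to 2$), not from the lecture:

```python
# Epsilon/2 argument on sample sequences (chosen for illustration):
# a_n = 1 + 1/n -> A = 1, b_n = 2 + 1/n**2 -> B = 2.
A, B = 1.0, 2.0
eps = 1e-3
N1 = 2001  # then |a_n - A| = 1/n   < eps/2 for all n >= N1
N2 = 46    # then |b_n - B| = 1/n^2 < eps/2 for all n >= N2
N = max(N1, N2)
for n in range(N, N + 500):
    assert abs((1 + 1/n) + (2 + 1/n**2) - (A + B)) < eps
```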

Product Law

$$\lim_{n \to \infty} (a_n \cdot b_n) = A \cdot B$$

Proof of Product Law

Let $\varepsilon > 0$. We need to show that $|a_n b_n - AB|$ can be made smaller than $\varepsilon$ for sufficiently large $n$. We use the algebraic manipulation:

$$|a_n b_n - AB| = |a_n b_n - a_n B + a_n B - AB| \leq |a_n|\,|b_n - B| + |B|\,|a_n - A|$$

Since $(a_n)$ converges to $A$, it is bounded. Let $M > 0$ be a bound such that $|a_n| \leq M$ for all $n$, and also assume $|B| \leq M$. Choose $N_1$ such that for $n \geq N_1$, $|a_n - A| < \frac{\varepsilon}{2M}$, and $N_2$ such that for $n \geq N_2$, $|b_n - B| < \frac{\varepsilon}{2M}$. Let $N = \max(N_1, N_2)$. Then for $n \geq N$:

$$|a_n b_n - AB| \leq |a_n|\,|b_n - B| + |B|\,|a_n - A| < M \cdot \frac{\varepsilon}{2M} + M \cdot \frac{\varepsilon}{2M} = \varepsilon$$

Thus, $\lim_{n \to \infty} (a_n \cdot b_n) = A \cdot B$.
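The key estimate in this proof — splitting $|a_n b_n - AB|$ via the triangle inequality — can be checked numerically on sample sequences (my choice, not from the lecture):

```python
# Checking |a_n*b_n - A*B| <= |a_n|*|b_n - B| + |B|*|a_n - A|
# on sample sequences a_n = 2 + 1/n -> A = 2, b_n = 3 - 1/n -> B = 3.
A, B = 2.0, 3.0
for n in range(1, 2000):
    an, bn = 2 + 1/n, 3 - 1/n
    lhs = abs(an * bn - A * B)
    rhs = abs(an) * abs(bn - B) + abs(B) * abs(an - A)
    assert lhs <= rhs + 1e-12  # small tolerance for floating-point error
```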

Quotient Law

If $B \neq 0$ and $b_n \neq 0$ for all $n$, then

$$\lim_{n \to \infty} \frac{a_n}{b_n} = \frac{A}{B}$$

Proof of Quotient Law

Written down quite thoroughly, because I didn’t get it even after watching it more than three times…

To prove the Quotient Law: If $\lim_{n \to \infty} a_n = A$ and $\lim_{n \to \infty} b_n = B$, with $B \neq 0$ and $b_n \neq 0$ for all $n$, then $\lim_{n \to \infty} \frac{a_n}{b_n} = \frac{A}{B}$.

We will tackle this in two main steps:

Step 1: Prove that $\lim_{n \to \infty} \frac{1}{b_n} = \frac{1}{B}$

This is the core of the Quotient Law proof. We need to show that the reciprocal of the sequence $(b_n)$ converges to the reciprocal of its limit $B$.

Let’s analyze the difference between $\frac{1}{b_n}$ and $\frac{1}{B}$:

$$\left| \frac{1}{b_n} - \frac{1}{B} \right| = \left| \frac{B - b_n}{b_n B} \right| = \frac{|b_n - B|}{|b_n|\,|B|}$$

Our goal is to make this quantity less than an arbitrary $\varepsilon > 0$ by choosing a sufficiently large $n$. We already know we can make $|b_n - B|$ small since $b_n \to B$. The potential issue is the $|b_n|$ in the denominator. If $|b_n|$ could get arbitrarily close to zero, even if $|b_n - B|$ is small, the fraction could become large. We need to show that $|b_n|$ is bounded away from zero for large enough $n$.

Since $b_n \to B$ and $B \neq 0$, we can use the definition of convergence to ensure $|b_n|$ stays away from zero. Let’s choose a specific positive value, $\varepsilon_0 = \frac{|B|}{2}$. Because $b_n \to B$, there exists an index $N_1$ such that for all $n \geq N_1$,

$$|b_n - B| < \frac{|B|}{2}$$

Now, we use the reverse triangle inequality: $\big|\, |x| - |y| \,\big| \leq |x - y|$. Applying this with $x = b_n$ and $y = B$, we get:

$$\big|\, |b_n| - |B| \,\big| \leq |b_n - B|$$

Combining this with the inequality from convergence, for $n \geq N_1$:

$$\big|\, |b_n| - |B| \,\big| < \frac{|B|}{2}$$

This means the difference between the magnitudes $|b_n|$ and $|B|$ is less than half the magnitude of $B$. Let’s unpack what $\big|\, |b_n| - |B| \,\big| < \frac{|B|}{2}$ implies. For any real $x$ and $c > 0$, $|x| < c$ is equivalent to $-c < x < c$. So:

$$-\frac{|B|}{2} < |b_n| - |B| < \frac{|B|}{2}$$

Adding $|B|$ to all parts of the inequality:

$$|B| - \frac{|B|}{2} < |b_n| < |B| + \frac{|B|}{2}$$

Simplifying, we get:

$$\frac{|B|}{2} < |b_n| < \frac{3|B|}{2}$$

The important part is the left inequality: $|b_n| > \frac{|B|}{2}$ for all $n \geq N_1$. This tells us that for $n \geq N_1$, $|b_n|$ is bounded below by a positive constant ($\frac{|B|}{2}$), preventing it from getting arbitrarily close to zero.

Now we go back to bounding our original expression (note that the lower bound $|b_n| > \frac{|B|}{2}$ becomes an upper bound on the fraction, since $|b_n|$ sits in the denominator): for $n \geq N_1$,

$$\left| \frac{1}{b_n} - \frac{1}{B} \right| = \frac{|b_n - B|}{|b_n|\,|B|} < \frac{|b_n - B|}{\frac{|B|}{2} \cdot |B|} = \frac{2\,|b_n - B|}{|B|^2}$$

Let $\varepsilon > 0$ be given. Since $b_n \to B$, there exists $N_2$ such that for all $n \geq N_2$,

$$|b_n - B| < \frac{|B|^2 \varepsilon}{2}$$

Let $N = \max(N_1, N_2)$. For all $n \geq N$, both conditions ($|b_n| > \frac{|B|}{2}$ and $|b_n - B| < \frac{|B|^2 \varepsilon}{2}$) hold. Thus, for $n \geq N$:

$$\left| \frac{1}{b_n} - \frac{1}{B} \right| < \frac{2\,|b_n - B|}{|B|^2} < \frac{2}{|B|^2} \cdot \frac{|B|^2 \varepsilon}{2} = \varepsilon$$

This proves that $\lim_{n \to \infty} \frac{1}{b_n} = \frac{1}{B}$.

Step 2: Apply the Product Law

Now that we have established $\lim_{n \to \infty} \frac{1}{b_n} = \frac{1}{B}$, we can use the Product Law, which we’ve already proven:

$$\lim_{n \to \infty} \frac{a_n}{b_n} = \lim_{n \to \infty} \left( a_n \cdot \frac{1}{b_n} \right) = A \cdot \frac{1}{B} = \frac{A}{B}$$

This completes the proof of the Quotient Law. We’ve shown that under the given conditions, the limit of the quotient of two sequences is indeed the quotient of their limits.
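Numerically, the two ingredients of Step 1 — the tail bound $|b_n| > \frac{|B|}{2}$ and the resulting estimate $\left|\frac{1}{b_n} - \frac{1}{B}\right| \leq \frac{2|b_n - B|}{|B|^2}$ — look like this for a sample sequence of my choosing:

```python
# Step 1 ingredients on b_n = 2 + (-1)**n / n -> B = 2 (sample sequence):
# past N1 the terms |b_n| stay above |B|/2, and the reciprocal difference
# is controlled by 2*|b_n - B| / |B|**2.
B = 2.0
N1 = 2  # here |b_n - B| = 1/n < |B|/2 = 1 already holds for n >= 2
for n in range(N1, 3000):
    bn = 2 + (-1)**n / n
    assert abs(bn) > abs(B) / 2
    assert abs(1/bn - 1/B) <= 2 * abs(bn - B) / B**2 + 1e-15
```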

Comparison Law

If $\lim_{n \to \infty} a_n = A$, $\lim_{n \to \infty} b_n = B$, and there exists $K \in \mathbb{N}$ such that $a_n \leq b_n$ for all $n \geq K$, then $A \leq B$.

Proof of Comparison Law

Assume for contradiction that $A > B$. Let $\varepsilon = \frac{A - B}{2} > 0$. Since $a_n \to A$, there exists $N_1$ such that for all $n \geq N_1$, $|a_n - A| < \varepsilon$, which implies $a_n > A - \varepsilon$. Similarly, since $b_n \to B$, there exists $N_2$ such that for all $n \geq N_2$, $|b_n - B| < \varepsilon$, which implies $b_n < B + \varepsilon$.

Let $N = \max(K, N_1, N_2)$. For $n \geq N$, we have both $a_n > A - \varepsilon$ and $b_n < B + \varepsilon$. Since $A - \varepsilon = B + \varepsilon = \frac{A + B}{2}$, this means $b_n < \frac{A + B}{2} < a_n$, so $a_n > b_n$, which contradicts the assumption that $a_n \leq b_n$ for all $n \geq K$. Therefore, our assumption must be false, and we conclude $A \leq B$.
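The contradiction can be visualised numerically: if $A > B$ held, then with $\varepsilon = \frac{A - B}{2}$ the tails of the two sequences would eventually separate around the midpoint $\frac{A + B}{2}$. A small sketch with sample sequences of my choosing:

```python
# If A > B held, the tails would separate around the midpoint (A + B)/2.
# Sample sequences: a_n = 1 - 1/n -> A = 1 and b_n = 1/n -> B = 0.
A, B = 1.0, 0.0
eps = (A - B) / 2          # = 0.5, the epsilon from the contradiction proof
N = 5                      # for these sequences both tail estimates hold here
for n in range(N, 1000):
    an, bn = 1 - 1/n, 1/n
    assert an > A - eps and bn < B + eps   # tails on opposite sides...
    assert an > bn                         # ...so a_n <= b_n must fail
```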

Example: Applying Limit Laws

Let $k \in \mathbb{Z}$. We will show that $\lim_{n \to \infty} \left( \sqrt[n]{n} \right)^k = 1$.

Proof: We know that $\lim_{n \to \infty} \sqrt[n]{n} = 1$ (proved in the previous lecture). Applying the product law repeatedly ($k$ times if $k$ is positive), we get:

$$\lim_{n \to \infty} \left( \sqrt[n]{n} \right)^k = 1 \cdot 1 \cdots 1 = 1$$

If $k$ is negative, use the quotient rule: $\left( \sqrt[n]{n} \right)^k = \frac{1}{\left( \sqrt[n]{n} \right)^{-k}} \to \frac{1}{1} = 1$.

Monotonic Sequences and the Power of Bounded Monotony

Monotonicity describes the directional trend of a sequence. A sequence is monotonic if its terms are consistently either increasing or decreasing. We distinguish between increasing and decreasing, as well as strict and non-strict monotonicity.

Monotonically Increasing Sequences

  1. Monotonically Increasing (Non-decreasing): A sequence $(a_n)$ is monotonically increasing if for all $n \geq 1$, $a_n \leq a_{n+1}$. Each term is greater than or equal to the preceding term.

  2. Strictly Monotonically Increasing: A sequence $(a_n)$ is strictly monotonically increasing if for all $n \geq 1$, $a_n < a_{n+1}$. Each term is strictly greater than the preceding term.

Monotonically Decreasing Sequences

  1. Monotonically Decreasing (Non-increasing): A sequence $(a_n)$ is monotonically decreasing if for all $n \geq 1$, $a_n \geq a_{n+1}$. Each term is less than or equal to the preceding term.

  2. Strictly Monotonically Decreasing: A sequence $(a_n)$ is strictly monotonically decreasing if for all $n \geq 1$, $a_n > a_{n+1}$. Each term is strictly less than the preceding term.

Monotonicity provides valuable information about the long-term behavior of a sequence. Monotonic sequences are “well-behaved” in the sense that they don’t oscillate wildly in direction.

The Weierstrass Monotone Convergence Theorem

The Weierstrass Monotone Convergence Theorem is a fundamental result that guarantees the convergence of bounded monotonic sequences. It is a powerful tool because it allows us to establish convergence without explicitly knowing the limit.

Theorem: Weierstrass Monotone Convergence Theorem

  1. Bounded and Monotonically Increasing: If a sequence $(a_n)$ is monotonically increasing and bounded above, then it converges to its supremum: $\lim_{n \to \infty} a_n = \sup\{a_n : n \geq 1\}$.

  2. Bounded and Monotonically Decreasing: If a sequence $(a_n)$ is monotonically decreasing and bounded below, then it converges to its infimum: $\lim_{n \to \infty} a_n = \inf\{a_n : n \geq 1\}$.

This theorem is remarkable because it connects the order properties (monotonicity) and boundedness of a sequence directly to its convergence. It assures us that a monotonic sequence that is “trapped” (bounded) must settle down to a limit.

Proof of the Weierstrass Monotone Convergence Theorem (for Increasing Sequences)

Let $(a_n)$ be a monotonically increasing sequence that is bounded above. Let $A = \{a_n : n \geq 1\}$ be the set of terms of the sequence. Since $(a_n)$ is bounded above, the set $A$ is also bounded above. By the completeness axiom of the real numbers, the supremum of $A$, denoted $s = \sup A$, exists and is a real number. We will show that $\lim_{n \to \infty} a_n = s$.

Let $\varepsilon > 0$ be given. By the characterization of the supremum, since $s - \varepsilon < s$, there exists an element $a_N \in A$ (for some $N \geq 1$) such that $a_N > s - \varepsilon$. Since $(a_n)$ is monotonically increasing, for all $n \geq N$, we have $a_n \geq a_N > s - \varepsilon$. Also, since $s$ is the supremum of $A$, $a_n \leq s$ for all $n$. Combining these inequalities, for all $n \geq N$:

$$s - \varepsilon < a_n \leq s < s + \varepsilon$$

This implies that for all $n \geq N$, $|a_n - s| < \varepsilon$. Thus, by definition, $\lim_{n \to \infty} a_n = s = \sup\{a_n : n \geq 1\}$.

The proof for monotonically decreasing sequences is analogous, using the infimum instead of the supremum and the lower bound property.
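A concrete numerical illustration of the theorem and its proof, using a sample sequence of my choosing, $a_n = 1 - \frac{1}{n}$ (increasing, bounded above, with supremum $1$):

```python
# a_n = 1 - 1/n is increasing and bounded above; its supremum is 1.
a = [1 - 1/n for n in range(1, 10001)]
assert all(x <= y for x, y in zip(a, a[1:]))  # monotonically increasing
assert all(x <= 1 for x in a)                 # 1 is an upper bound

eps = 1e-3
# Characterization of the supremum: some term exceeds 1 - eps...
i = next(i for i, x in enumerate(a) if x > 1 - eps)
# ...and by monotonicity, every later term stays within eps of 1.
assert all(abs(x - 1) < eps for x in a[i:])
```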

Example: The Limit of $n^k q^n$ for $|q| < 1$ - Why Exponential Decay Wins

Let’s explore the limit of sequences of the form $n^k q^n$ as $n \to \infty$, where $k$ is an integer and $|q| < 1$. We want to understand why, for $|q| < 1$, the exponential term $q^n$ “dominates” the polynomial term $n^k$, forcing the limit to be zero.

Intuitively, when $|q| < 1$, raising $q$ to higher and higher powers makes $q^n$ approach zero very rapidly. On the other hand, while $n^k$ (for positive $k$) grows as $n$ increases, the decay of $q^n$ is much stronger, eventually overpowering the polynomial growth.

We will prove that for any integer $k$ and $|q| < 1$, $\lim_{n \to \infty} n^k q^n = 0$. We’ll consider two cases:

Case 1: Non-Positive Integer Exponent ($k \leq 0$)

If $k \leq 0$, then we can rewrite $n^k$ as $\frac{1}{n^m}$, where $m = -k \geq 0$. Our sequence becomes $\frac{q^n}{n^m}$.

  • Subcase 1.1: $k = 0$: In this case, $n^k = 1$, and the sequence is just $q^n$. We know that for $|q| < 1$, $\lim_{n \to \infty} q^n = 0$. This is a standard limit we can assume we’ve already established (or can prove using the Weierstrass Theorem for decreasing sequences, as $|q|^n$ is decreasing and bounded below by 0).

  • Subcase 1.2: $k < 0$: Here, $m = -k > 0$. We have $n^k q^n = \frac{q^n}{n^m}$. Since $n \geq 1$ and $m > 0$, we know that $n^m \geq 1$. Therefore, for $n \geq 1$:

$$\left| \frac{q^n}{n^m} \right| \leq |q|^n$$

    We have bounded the absolute value of our sequence by $|q|^n$. We already know $\lim_{n \to \infty} |q|^n = 0$ (since $|q| < 1$). By the Squeeze Theorem (or more precisely, the Comparison Theorem for limits - since we know $0 \leq |n^k q^n| \leq |q|^n$ and $\lim_{n \to \infty} |q|^n = 0$, it follows that $\lim_{n \to \infty} |n^k q^n| = 0$, which implies $\lim_{n \to \infty} n^k q^n = 0$), we conclude that $\lim_{n \to \infty} n^k q^n = 0$ when $k < 0$.

Case 2: Positive Integer Exponent ($k > 0$)

Now, consider the case where $k$ is a positive integer. If $q = 0$ the sequence is zero from the start, so assume $q \neq 0$ and set $a_n = n^k |q|^n > 0$; since $|n^k q^n| = a_n$, it suffices to show $a_n \to 0$. It’s not immediately obvious how to use monotonicity or boundedness directly. Instead, we’ll use a clever trick involving comparison to a geometric sequence.

Let’s look at the ratio of consecutive terms: $\frac{a_{n+1}}{a_n}$. This ratio will tell us how quickly the terms are changing as $n$ increases:

$$\frac{a_{n+1}}{a_n} = \frac{(n+1)^k |q|^{n+1}}{n^k |q|^n} = \left( 1 + \frac{1}{n} \right)^k |q|$$

As $n \to \infty$, we know that $\frac{1}{n} \to 0$, so $1 + \frac{1}{n} \to 1$. By the Limit Laws, $\left( 1 + \frac{1}{n} \right)^k \to 1^k = 1$. Therefore, $\lim_{n \to \infty} \frac{a_{n+1}}{a_n} = |q|$.

Since $|q| < 1$, this limit of the ratio is less than 1. This means that for sufficiently large $n$, the terms are becoming smaller relative to the previous term.

Key Insight: Because the ratio approaches $|q| < 1$, the sequence behaves like a geometric sequence with a ratio less than 1, which we know converges to zero.

Let’s make this rigorous. Since $|q| < 1$, we can choose a number $c$ such that $|q| < c < 1$ (for example, $c = \frac{1 + |q|}{2}$). Because the limit of the ratio is less than $c$, for sufficiently large $n$, the ratio itself must be less than $c$. Formally, there exists an integer $N$ such that for all $n \geq N$,

$$\frac{a_{n+1}}{a_n} < c$$

Since $c > 0$, and $a_n > 0$ for all $n$ (as $q \neq 0$), we have $a_{n+1} < c \, a_n$ for all $n \geq N$. Therefore, each term beyond index $N$ is smaller than $c$ times its predecessor. Let’s examine the terms starting from index $N$:

  • $a_{N+1} < c \, a_N$
  • $a_{N+2} < c \, a_{N+1} < c^2 a_N$
  • $a_{N+3} < c \, a_{N+2} < c^3 a_N$
  • … and so on.

In general, for any $n > N$, we have $a_n < c^{n-N} a_N$. Let’s rewrite this by setting $m = n - N$, so $n = N + m$. Then for $n > N$, we have:

$$a_n < c^{n-N} a_N = \frac{a_N}{c^N} \cdot c^n = C \cdot c^n$$

where $C = \frac{a_N}{c^N}$ is a constant that depends on $N$ and $c$, but crucially, it does not depend on $n$.

So, for $n > N$, we have $0 < a_n < C \cdot c^n$. Since $0 < c < 1$, we know $\lim_{n \to \infty} c^n = 0$. Therefore, $\lim_{n \to \infty} C \cdot c^n = 0$. Again, by the Squeeze Theorem (or Comparison Theorem), since $0 < a_n < C \cdot c^n$ and $\lim_{n \to \infty} C \cdot c^n = 0$, we conclude that $\lim_{n \to \infty} a_n = 0$.

Conclusion: In both cases ($k \leq 0$ and $k > 0$), we have shown that $\lim_{n \to \infty} n^k q^n = 0$ for any integer $k$ and $|q| < 1$. This demonstrates the powerful effect of exponential decay compared to polynomial growth in the limit.
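The geometric-comparison argument can be checked numerically for sample values $k = 3$, $q = \frac{1}{2}$ (my choice): beyond some $N$ the ratio stays below $c = \frac{1 + |q|}{2}$, and the terms are dominated by $C \cdot c^n$.

```python
# Geometric comparison for a_n = n**k * q**n with k = 3, q = 0.5:
# past N the ratio a_{n+1}/a_n stays below c = (1 + q)/2 < 1, and
# a_n < C * c**n with C = a_N / c**N.
k, q = 3, 0.5
c = (1 + q) / 2  # = 0.75, strictly between q and 1

def a(n: int) -> float:
    return n**k * q**n

# first index past which the ratio (1 + 1/n)**k * q drops below c
N = next(n for n in range(1, 1000) if (1 + 1/n)**k * q < c)
assert all(a(n + 1) / a(n) < c for n in range(N, N + 500))

C = a(N) / c**N
assert all(a(n) < C * c**n for n in range(N + 1, N + 500))
```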

Example: The Limit of $\sqrt[n]{c}$ is 1

Let’s examine the sequence $a_n = \sqrt[n]{c}$ for a fixed $c > 0$. We want to show that $\lim_{n \to \infty} \sqrt[n]{c} = 1$.

First suppose $c \geq 1$. Then $\sqrt[n]{c} \geq 1$, so we can write $\sqrt[n]{c} = 1 + x_n$ with $x_n \geq 0$. By Bernoulli’s Inequality (proved below), $c = (1 + x_n)^n \geq 1 + n x_n$, hence $0 \leq x_n \leq \frac{c - 1}{n}$. Since $\frac{c - 1}{n} \to 0$, the Squeeze Theorem gives $x_n \to 0$, and so $\sqrt[n]{c} \to 1$. If $0 < c < 1$, apply the Quotient Law to $\sqrt[n]{c} = \frac{1}{\sqrt[n]{1/c}}$, where $1/c > 1$.

Therefore, $\lim_{n \to \infty} \sqrt[n]{c} = 1$.
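Numerically, the Bernoulli-type bound $0 \leq \sqrt[n]{c} - 1 \leq \frac{c-1}{n}$ (for $c \geq 1$) is easy to confirm, e.g. with the sample value $c = 10$:

```python
# For c >= 1, write c**(1/n) = 1 + x_n; Bernoulli's Inequality gives
# 0 <= x_n <= (c - 1)/n, which squeezes c**(1/n) toward 1. Sample: c = 10.
c = 10.0
for n in range(1, 2000):
    x_n = c ** (1 / n) - 1
    assert 0 <= x_n <= (c - 1) / n + 1e-12  # tolerance for rounding
```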

Example: Euler’s Number (e)

Consider the sequence defined by $a_n = \left( 1 + \frac{1}{n} \right)^n$. This sequence is fundamental in mathematics and converges to a special irrational number denoted by $e$, Euler’s number, approximately $2.71828$.

Statement: The sequence $\left( \left( 1 + \frac{1}{n} \right)^n \right)_{n \geq 1}$ converges. Its limit is denoted by $e$:

$$e := \lim_{n \to \infty} \left( 1 + \frac{1}{n} \right)^n$$

Euler’s number appears in numerous areas of mathematics, science, and engineering. It is the base of the natural logarithm, arises in calculus, probability, compound interest, and many other contexts.
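A quick numerical look at the sequence, using `math.e` as the reference value:

```python
import math

# a_n = (1 + 1/n)**n increases toward e ~ 2.71828...
a = [(1 + 1/n)**n for n in range(1, 10001)]
assert all(x < y for x, y in zip(a, a[1:]))  # strictly increasing
assert all(x < math.e for x in a)            # bounded above by e
assert abs(a[-1] - math.e) < 1e-3            # close to e for large n
```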

Bernoulli’s Inequality: A Useful Tool for Inequalities

Bernoulli’s Inequality: For any integer $n \geq 0$ and any real number $x \geq -1$, the following inequality holds:

$$(1 + x)^n \geq 1 + nx$$

Proof by Induction

  1. Base Case (n=0): For $n = 0$, $(1 + x)^0 = 1$ and $1 + 0 \cdot x = 1$. So, $1 \geq 1$, which is true.

  2. Inductive Hypothesis: Assume Bernoulli’s Inequality holds for some $k \geq 0$, i.e., $(1 + x)^k \geq 1 + kx$.

  3. Inductive Step (n=k+1): We want to show that $(1 + x)^{k+1} \geq 1 + (k+1)x$. Since $x \geq -1$, we have $1 + x \geq 0$. Multiplying both sides of the inductive hypothesis by $(1 + x)$ (which is non-negative, so the inequality direction is preserved):

$$(1 + x)^{k+1} = (1 + x)^k (1 + x) \geq (1 + kx)(1 + x)$$

    Expanding the right side:

$$(1 + kx)(1 + x) = 1 + (k+1)x + kx^2$$

    Since $kx^2 \geq 0$ (as $k \geq 0$ and $x^2 \geq 0$), we have:

$$1 + (k+1)x + kx^2 \geq 1 + (k+1)x$$

    Therefore, combining the inequalities:

$$(1 + x)^{k+1} \geq 1 + (k+1)x$$

    Thus, the inequality holds for $n = k+1$. By the principle of mathematical induction, Bernoulli’s Inequality holds for all integers $n \geq 0$ and all $x \geq -1$.
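A brute-force check of the inequality over a small grid of $n$ and $x$ values (grid chosen by me for illustration):

```python
# Brute-force check of (1 + x)**n >= 1 + n*x for integers n >= 0 and x >= -1.
for n in range(0, 30):
    for x in (-1.0, -0.5, -0.1, 0.0, 0.3, 1.0, 2.5):
        assert (1 + x)**n >= 1 + n * x - 1e-9  # tolerance for rounding
```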

Visual Intuition:

Consider the function $f(x) = (1 + x)^n$ and its tangent line at $x = 0$: we have $f(0) = 1$ and $f'(0) = n$, so the tangent line is $y = 1 + nx$, and Bernoulli’s Inequality says the graph of $f$ lies above this tangent line. However, this approach is slightly more complex than necessary for a basic visual intuition.

Bernoulli’s Inequality is a useful tool for proving other inequalities and in various areas of analysis, especially when dealing with expressions of the form $(1 + x)^n$.

Limit Superior and Limit Inferior

While some sequences converge to a single, well-defined limit, others might oscillate or exhibit more complex behavior. Limit superior and limit inferior provide a way to characterize the “ultimate upper bound” and “ultimate lower bound” of a sequence, even if it doesn’t converge in the traditional sense. These concepts are crucial for understanding the long-term behavior of sequences, particularly when convergence is not guaranteed.

Definitions: Using Supremum and Infimum to Define Ultimate Bounds

Let $(a_n)_{n \geq 1}$ be a sequence of real numbers. We define the limit superior and limit inferior using the supremum and infimum of “tail” sets of the sequence.

For each $n \geq 1$, consider the “tail set” of the sequence starting from index $n$:

$$A_n = \{ a_k : k \geq n \}$$

Now, define two new sequences, $(b_n)$ and $(c_n)$, where:

  • $b_n = \sup A_n$ (Supremum of the tail set $A_n$)
  • $c_n = \inf A_n$ (Infimum of the tail set $A_n$)

Definition of Limit Superior and Limit Inferior:

  • Limit Superior (limsup): The limit superior of the sequence $(a_n)$, denoted $\limsup_{n \to \infty} a_n$ or $\overline{\lim}_{n \to \infty} a_n$, is defined as the limit of the sequence $(b_n)$:

$$\limsup_{n \to \infty} a_n := \lim_{n \to \infty} b_n = \lim_{n \to \infty} \sup\{ a_k : k \geq n \}$$

  • Limit Inferior (liminf): The limit inferior of the sequence $(a_n)$, denoted $\liminf_{n \to \infty} a_n$ or $\underline{\lim}_{n \to \infty} a_n$, is defined as the limit of the sequence $(c_n)$:

$$\liminf_{n \to \infty} a_n := \lim_{n \to \infty} c_n = \lim_{n \to \infty} \inf\{ a_k : k \geq n \}$$

Intuitive Understanding: “Ultimate” Upper and Lower Bounds

Imagine observing the values of a sequence $(a_n)$ as $n$ gets larger.

  • Limit Superior as the “Largest Limit Point”: Think of $\limsup_{n \to \infty} a_n$ as the largest value that the sequence “approaches” in some sense. It’s the value that the sequence gets arbitrarily close to, infinitely often, and is the largest such value. It represents the highest possible “accumulation point” of the sequence.

  • Limit Inferior as the “Smallest Limit Point”: Similarly, $\liminf_{n \to \infty} a_n$ is the smallest value that the sequence “approaches” infinitely often. It is the lowest “accumulation point”.

Consider an oscillating sequence like $a_n = (-1)^n$. The tail sets are always $A_n = \{-1, 1\}$. For all $n$, $\sup A_n = 1$ and $\inf A_n = -1$. Thus, $b_n = 1$ and $c_n = -1$ for all $n$. Therefore, $\limsup_{n \to \infty} a_n = 1$ and $\liminf_{n \to \infty} a_n = -1$. These are indeed the “largest” and “smallest” values the sequence repeatedly gets close to.
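We can mimic the definitions computationally on a finite window (a sketch only — the true tail sets are infinite, but for $a_n = (-1)^n$ the pattern is periodic):

```python
# Finite-window versions of b_n = sup A_n and c_n = inf A_n for a_n = (-1)**n.
a = [(-1)**n for n in range(1, 101)]
b = [max(a[i:]) for i in range(len(a) - 2)]  # each windowed tail keeps >= 2 terms
c = [min(a[i:]) for i in range(len(a) - 2)]
assert all(x == 1 for x in b)    # b_n = 1  for all n  ->  limsup = 1
assert all(x == -1 for x in c)   # c_n = -1 for all n  ->  liminf = -1
```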

Existence of Limit Superior and Limit Inferior

Theorem: Existence of Limit Superior and Limit Inferior for Bounded Sequences

For any bounded sequence $(a_n)$, both the limit superior and limit inferior exist as real numbers.

Justification (Sketch):

  1. Boundedness of Tail Sets: If the sequence $(a_n)$ is bounded, then each tail set $A_n$ is also bounded. Therefore, for each $n$, $b_n = \sup A_n$ and $c_n = \inf A_n$ are well-defined real numbers.

  2. Monotonicity of $(b_n)$ and $(c_n)$: As $n$ increases, the tail set becomes smaller (in the sense that $A_{n+1} \subseteq A_n$).

    • Since $A_{n+1} \subseteq A_n$, $\sup A_{n+1} \leq \sup A_n$, i.e., $b_{n+1} \leq b_n$. Thus, the sequence $(b_n)$ is monotonically decreasing (non-increasing).

    • Similarly, since $A_{n+1} \subseteq A_n$, $\inf A_{n+1} \geq \inf A_n$, i.e., $c_{n+1} \geq c_n$. Thus, the sequence $(c_n)$ is monotonically increasing (non-decreasing).

  3. Boundedness of $(b_n)$ and $(c_n)$: Since $(a_n)$ is bounded, say $|a_k| \leq M$ for all $k$. Then for any tail set $A_n$, $-M \leq a_k \leq M$ for all $k \geq n$. This implies that $-M$ is a lower bound for $(c_n)$ and $M$ is an upper bound for $(b_n)$. Therefore, $-M \leq c_n \leq b_n \leq M$. So, both $(b_n)$ and $(c_n)$ are bounded.

  4. Convergence by Weierstrass Theorem: Since $(b_n)$ is monotonically decreasing and bounded below, by the Weierstrass Monotone Convergence Theorem, it converges to a limit. This limit is $\limsup_{n \to \infty} a_n$. Similarly, since $(c_n)$ is monotonically increasing and bounded above, it converges to a limit, which is $\liminf_{n \to \infty} a_n$.

Therefore, for any bounded sequence, both the limit superior and limit inferior always exist as finite real numbers.
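As a final sanity check of step 2 of the sketch, finite-window versions of $b_n$ and $c_n$ for a sample bounded sequence of my choosing, $a_n = \frac{(-1)^n}{n}$, are indeed monotone:

```python
# For the bounded sample sequence a_n = (-1)**n / n, finite-window versions of
# b_n = sup A_n and c_n = inf A_n are monotone, as step 2 of the sketch argues.
a = [(-1)**n / n for n in range(1, 201)]
b = [max(a[i:]) for i in range(150)]
c = [min(a[i:]) for i in range(150)]
assert all(x >= y for x, y in zip(b, b[1:]))  # (b_n) monotonically decreasing
assert all(x <= y for x, y in zip(c, c[1:]))  # (c_n) monotonically increasing
```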