2.1 The Limit of a Sequence: Getting Closer and Closer

Imagine an infinite list of numbers, one after another: $a_1, a_2, a_3, \dots$. This is a sequence. We are interested in what happens to these numbers as we go further and further down the list – as the index $n$ approaches “infinity”. Do they settle down to a particular value? Do they shoot off to infinity themselves? Or do they just bounce around erratically?

We say a sequence $(a_n)$ converges to a limit $L$ if the terms $a_n$ get arbitrarily close to $L$ as $n$ gets larger and larger. “Arbitrarily close” is the key idea here. No matter how small a distance you pick, eventually the terms of the sequence will be within that distance of the limit.

Here’s the formal definition, which makes “arbitrarily close” precise using $\varepsilon$ and $N$:

A sequence $(a_n)$ converges to the limit $L$ if for every $\varepsilon > 0$ there exists an $N \in \mathbb{N}$ such that $|a_n - L| < \varepsilon$ for all $n \ge N$.

Think of $\varepsilon$ as your “tolerance” for closeness. If you give me any tolerance $\varepsilon > 0$, I can find a point $N$ in the sequence such that everything after that point is within your tolerance of the limit $L$.

Example: Consider, for instance, the sequence $a_n = \frac{n}{n+1}$: $\frac{1}{2}, \frac{2}{3}, \frac{3}{4}, \dots$. It looks like these numbers are getting closer and closer to 1. Let’s prove that $\lim_{n \to \infty} \frac{n}{n+1} = 1$: given $\varepsilon > 0$, we have $\left|\frac{n}{n+1} - 1\right| = \frac{1}{n+1} < \varepsilon$ for all $n \ge N$, provided we choose $N > \frac{1}{\varepsilon}$.
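
As a quick numerical sanity check (a minimal sketch, assuming the example sequence $a_n = n/(n+1)$), we can pick a tolerance and verify that every term beyond the cutoff stays within it:

```python
import math

def a(n):
    # Assumed example sequence a_n = n / (n + 1), converging to 1.
    return n / (n + 1)

def find_N(eps):
    # |a_n - 1| = 1/(n+1) < eps holds as soon as n >= ceil(1/eps).
    return math.ceil(1 / eps)

eps = 1e-3
N = find_N(eps)
# Every term from index N onward lies within eps of the limit 1.
assert all(abs(a(n) - 1) < eps for n in range(N, N + 10_000))
```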

Uniqueness and Boundedness of Limits

A sequence can only converge to one limit. It can’t approach two different values at the same time.

If a sequence converges, it must also be bounded: there is some $M > 0$ with $|a_n| \le M$ for all $n$. It can’t go off to infinity or become arbitrarily large.

Boundedness is a necessary condition for convergence, but it’s not sufficient. A bounded sequence doesn’t have to converge (think of the sequence $a_n = (-1)^n$, which is bounded but diverges). Convergence is a stronger condition.

2.2 Divergence: When Sequences Go Their Own Way

Not all sequences settle down to a finite limit. Some sequences diverge. Divergence simply means a sequence does not converge to a real number. There are different ways a sequence can diverge. It might oscillate, grow without bound, or behave in some other non-convergent way.

One particular type of divergence is divergence to infinity. This is when the terms of a sequence become larger and larger, exceeding any bound you can set: we write $\lim_{n \to \infty} a_n = \infty$ if for every $M > 0$ there exists an $N$ such that $a_n > M$ for all $n \ge N$.

Example: Consider, for instance, the sequence $a_n = 2^n$. This sequence grows rapidly. Let’s show that $\lim_{n \to \infty} 2^n = \infty$: given any bound $M > 0$, we have $2^n > M$ for all $n > \log_2 M$.
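
The same tolerance-and-cutoff idea works for divergence to infinity. A minimal sketch, assuming the example sequence $a_n = 2^n$: given any bound $M$, we compute an index past which all terms exceed $M$.

```python
import math

def find_N(M):
    # Assumed example sequence a_n = 2**n; 2**n > M once n > log2(M).
    return math.floor(math.log2(M)) + 1

M = 10**9
N = find_N(M)
# All terms from index N onward exceed the bound M.
assert all(2**n > M for n in range(N, N + 100))
```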

2.3 Limit Theorems: The Algebra of Convergence

Working directly with the $\varepsilon$-$N$ definition can be cumbersome. Fortunately, there are powerful theorems that let us compute limits more easily by manipulating convergent sequences algebraically. These are the Limit Laws.

These laws are incredibly useful. They tell us that limits behave nicely with basic arithmetic operations. We can find the limit of a sum by summing the limits, the limit of a product by multiplying the limits, and so on (with some conditions, like for division).

Example: Let’s find $\lim_{n \to \infty} \frac{1}{n^2}$.

We know $\lim_{n \to \infty} \frac{1}{n} = 0$. Using the Product Law (Theorem 2.1.8 (2)), we can write:

$$\lim_{n \to \infty} \frac{1}{n^2} = \left( \lim_{n \to \infty} \frac{1}{n} \right) \cdot \left( \lim_{n \to \infty} \frac{1}{n} \right) = 0 \cdot 0 = 0.$$

More generally, for any integer $k \ge 1$, we can use repeated application of the product and quotient laws to show:

$$\lim_{n \to \infty} \frac{1}{n^k} = 0.$$
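
A short numerical illustration of this limit (a sketch only; the tolerance and cutoffs are arbitrary choices):

```python
def a(n, k):
    # Terms of the sequence 1 / n**k.
    return 1 / n**k

eps = 1e-6
for k in (1, 2, 3):
    # Cutoff with n**k > 1/eps, padded to avoid float edge cases.
    N = int(eps ** (-1 / k)) + 2
    # All terms past the cutoff are below the tolerance.
    assert all(a(n, k) < eps for n in range(N, N + 1000))
```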

These Limit Laws are essential tools in analysis, allowing us to break down complex limits into simpler pieces and compute them systematically. They form the basis for much of what comes next, including the study of series.

2.4 Monotone Sequences: Always Going in One Direction

A monotone sequence is one that always increases ($a_{n+1} \ge a_n$ for all $n$) or always decreases ($a_{n+1} \le a_n$ for all $n$). Monotone sequences are simpler to analyze than sequences that jump around.

Think of a sequence walking uphill (increasing) or downhill (decreasing) – it always moves in one direction.

2.5 The Monotone Convergence Theorem (Weierstrass Theorem): Bounded and Monotone = Convergent

A fundamental result in analysis is the Monotone Convergence Theorem, also known as the Weierstrass Theorem. It provides a powerful tool for proving convergence, especially when you can show a sequence is both monotone and bounded: every monotonically increasing sequence that is bounded above converges (to its supremum), and every monotonically decreasing sequence that is bounded below converges (to its infimum).

In simpler terms: If a sequence is going uphill and has a ceiling, it must approach that ceiling (or something below it). If it’s going downhill and has a floor, it must approach that floor (or something above it).

Example: Consider the sequence $a_n = \left(1 + \frac{1}{n}\right)^n$. We can show (though it’s not immediately obvious) that this sequence is monotonically increasing and bounded above. Therefore, by the Weierstrass Theorem, it must converge. Its limit is the famous number $e$.
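
We can watch this happen numerically. A minimal sketch (the ranges and tolerances are illustrative choices):

```python
import math

def a(n):
    # a_n = (1 + 1/n)**n, which increases toward e.
    return (1 + 1 / n) ** n

values = [a(n) for n in range(1, 2000)]
assert all(x < y for x, y in zip(values, values[1:]))  # monotonically increasing
assert all(x < 3 for x in values)                      # bounded above by 3
assert abs(a(10**7) - math.e) < 1e-6                   # close to e far out
```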

Example: Let’s look at another sequence defined recursively, for instance $a_1 = \sqrt{2}$, $a_{n+1} = \sqrt{2 + a_n}$. We can use the Weierstrass Theorem to show this converges: the terms increase and stay below 2, and the limit $L$ must satisfy $L = \sqrt{2 + L}$, giving $L = 2$.
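
Here is a numerical sketch of this pattern, using the illustrative recursion $a_1 = \sqrt{2}$, $a_{n+1} = \sqrt{2 + a_n}$ (an assumed stand-in example): the terms increase, stay below 2, and approach the fixed point $L = 2$ of $L = \sqrt{2 + L}$.

```python
import math

# Assumed illustrative recursion: a_1 = sqrt(2), a_{n+1} = sqrt(2 + a_n).
a = math.sqrt(2)
terms = [a]
for _ in range(60):
    a = math.sqrt(2 + a)
    terms.append(a)

assert all(x <= y for x, y in zip(terms, terms[1:]))  # monotone increasing
assert all(x <= 2 for x in terms)                     # bounded above by 2
assert abs(terms[-1] - 2) < 1e-9                      # limit is the fixed point 2
```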

2.6 The Cauchy Criterion: Convergence Without Knowing the Limit

The definition of convergence requires knowing the limit $L$. But what if we don’t know the limit in advance? The Cauchy Criterion provides a way to determine if a sequence converges without knowing its limit. It’s based on the idea that the terms of a convergent sequence must eventually become close to each other.

A Cauchy sequence is one where the terms eventually get arbitrarily close to each other, regardless of what the limit might be: for every $\varepsilon > 0$ there exists an $N$ such that $|a_n - a_m| < \varepsilon$ for all $n, m \ge N$.
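
A small numerical sketch of the Cauchy property, using the illustrative sequence $a_n = 1/n$ (an assumed example): past a cutoff $N$, any two terms differ by less than the tolerance.

```python
def a(n):
    # Illustrative Cauchy sequence a_n = 1/n.
    return 1 / n

eps = 1e-3
N = int(1 / eps) + 1   # for n, m >= N: |1/n - 1/m| <= 1/N < eps
sample = range(N, N + 200)
# Every pair of terms past the cutoff is within eps of each other.
assert all(abs(a(n) - a(m)) < eps for n in sample for m in sample)
```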

The Cauchy Criterion is a powerful theoretical tool. It’s used to prove many important results in analysis, and it provides a way to rigorously define convergence in more abstract settings where a limit might not be readily apparent.

This concludes our exploration of individual sequences: convergence, divergence, and the fundamental theorems that govern their behavior. These concepts are the bedrock for series, which we turn to next, and for continuity, differentiation, and integration in later chapters.

2.7 Series: Adding Up Infinitely Many Terms

Imagine taking a sequence $(a_n)$ and adding up all its terms: $a_1 + a_2 + a_3 + \cdots$. This infinite sum is called a series. Does this sum make sense? Can we actually add up infinitely many numbers and get a finite result? Sometimes yes, sometimes no.

To make sense of an infinite sum, we look at partial sums. The $n$-th partial sum $s_n$ is the sum of the first $n$ terms:

$$s_n = a_1 + a_2 + \cdots + a_n = \sum_{k=1}^{n} a_k.$$

A series $\sum_{n=1}^{\infty} a_n$ converges if the sequence $(s_n)$ of its partial sums converges to a limit $s$; we then write $\sum_{n=1}^{\infty} a_n = s$.
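
The definition translates directly into a computation. A minimal sketch (the example terms $a_n = 1/2^n$ are an illustrative choice):

```python
from itertools import accumulate

def partial_sums(terms):
    # Yields s_1, s_2, s_3, ... where s_n = a_1 + ... + a_n.
    return accumulate(terms)

# Example terms a_n = 1/2**n; the partial sums approach 1.
s = list(partial_sums(1 / 2**n for n in range(1, 100)))
assert s[0] == 0.5 and s[1] == 0.75
assert abs(s[-1] - 1) < 1e-12
```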

Example: The Geometric Series

Consider the geometric series $\sum_{n=0}^{\infty} x^n = 1 + x + x^2 + x^3 + \cdots$. This series is fundamental in mathematics. When does it converge? And if it converges, what’s its sum? The partial sums are $s_n = \frac{1 - x^{n+1}}{1 - x}$ for $x \ne 1$, so the series converges exactly when $|x| < 1$, and in that case its sum is $\frac{1}{1 - x}$.
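
We can check the closed-form sum $\frac{1}{1-x}$ numerically. A sketch, with arbitrary sample values of $x$ inside $(-1, 1)$:

```python
def geometric_partial_sum(x, n):
    # s_n = 1 + x + x**2 + ... + x**n
    return sum(x**k for k in range(n + 1))

for x in (0.5, -0.9, 0.99):
    expected = 1 / (1 - x)   # the series sum for |x| < 1
    # A long partial sum should be very close to the closed form.
    assert abs(geometric_partial_sum(x, 5000) - expected) < 1e-9
```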

Example: The Harmonic Series

The harmonic series is $\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \cdots$. Does this series converge or diverge? It turns out it diverges, even though the terms go to zero: grouping the terms into blocks of lengths $1, 2, 4, 8, \dots$ shows that $s_{2^k} \ge 1 + \frac{k}{2}$, so the partial sums exceed every bound. The terms don’t go to zero fast enough for the sum to be finite.
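
The divergence can be seen from the classical grouping bound $s_{2^k} \ge 1 + k/2$ (each block of lengths $1, 2, 4, \dots$ contributes at least $\tfrac{1}{2}$). A numerical sketch:

```python
def harmonic_partial_sum(n):
    # s_n = 1 + 1/2 + ... + 1/n
    return sum(1 / k for k in range(1, n + 1))

# Grouping bound: s_{2**k} >= 1 + k/2, so the partial sums exceed any bound.
for k in range(1, 15):
    assert harmonic_partial_sum(2**k) >= 1 + k / 2
```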

2.8 Properties of Convergent Series: Linearity

Like limits of sequences, convergent series behave nicely with addition and scalar multiplication (linearity): if $\sum a_n$ and $\sum b_n$ converge and $c$ is a constant, then

$$\sum_{n=1}^{\infty} (a_n + b_n) = \sum_{n=1}^{\infty} a_n + \sum_{n=1}^{\infty} b_n \quad \text{and} \quad \sum_{n=1}^{\infty} c \, a_n = c \sum_{n=1}^{\infty} a_n.$$

These laws allow us to manipulate convergent series in familiar algebraic ways.

2.9 Convergence Tests: Determining if a Series Converges

We need tools to determine if a series converges without having to find its sum explicitly. The Cauchy Criterion for series is one such tool.

This criterion says a series $\sum a_n$ converges if and only if the “tails” of the series become arbitrarily small: for every $\varepsilon > 0$ there exists an $N$ such that $\left| \sum_{k=m+1}^{n} a_k \right| < \varepsilon$ for all $n > m \ge N$.

Comparison Test: If we have a series that is “smaller” than a known convergent series (say $0 \le a_n \le b_n$ for all $n$, with $\sum b_n$ convergent), it must also converge. Conversely, if it’s “larger” than a known divergent series of non-negative terms, it must also diverge.
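
A concrete sketch of the Comparison Test (the example pair is an illustrative choice): the terms $1/(k(k+1))$ sit below $1/k^2$, and the smaller series even telescopes, so its partial sums can be checked against a closed form.

```python
def partial(term, n):
    # n-th partial sum of a series with the given term function.
    return sum(term(k) for k in range(1, n + 1))

n = 10_000
# 0 <= 1/(k*(k+1)) <= 1/k**2 for all k, so the left-hand series converges too.
assert partial(lambda k: 1 / (k * (k + 1)), n) <= partial(lambda k: 1 / k**2, n)
# Telescoping check: sum_{k=1}^{n} 1/(k(k+1)) = 1 - 1/(n+1).
assert abs(partial(lambda k: 1 / (k * (k + 1)), n) - (1 - 1 / (n + 1))) < 1e-10
```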

Absolute Convergence Test: If the series of absolute values $\sum |a_n|$ converges, then the original series $\sum a_n$ must also converge. This is a powerful test because it allows us to use comparison tests (which require non-negative terms) on series with both positive and negative terms.

Ratio Test and Root Test: These are practical tests that use ratios or roots of terms to determine convergence: if $\left| \frac{a_{n+1}}{a_n} \right|$ (Ratio Test) or $\sqrt[n]{|a_n|}$ (Root Test) eventually stays below some fixed $q < 1$, the series converges absolutely.

These tests provide practical methods for checking convergence of many series, especially power series and series where the terms have factorial or exponential components.
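A sketch of the Ratio Test in action on exponential-type terms $a_n = x^n/n!$ (the value $x = 2$ is an arbitrary illustrative choice): the consecutive ratios $a_{n+1}/a_n = x/(n+1)$ shrink toward 0, well below 1, so the series converges.

```python
from math import factorial

def term(n, x=2.0):
    # a_n = x**n / n!, the exponential series terms (x = 2 is illustrative).
    return x**n / factorial(n)

ratios = [term(n + 1) / term(n) for n in range(1, 50)]
assert all(r2 < r1 for r1, r2 in zip(ratios, ratios[1:]))  # ratios decrease...
assert ratios[-1] < 0.1                                    # ...toward 0, far below 1
```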

This is just the beginning of our journey into series. There’s much more to explore, including alternating series, power series, and the fascinating world of uniform convergence and its implications for calculus operations on infinite sums.

2.10 Alternating Series: A Dance of Plus and Minus

An alternating series is a series where the terms alternate in sign, like:

$$a_1 - a_2 + a_3 - a_4 + \cdots = \sum_{n=1}^{\infty} (-1)^{n+1} a_n,$$

where $a_n \ge 0$ for all $n$. These series can converge even when the series of absolute values diverges (i.e., they can be conditionally convergent). The Alternating Series Test (Leibniz Criterion) gives us a simple condition for their convergence: if $(a_n)$ decreases monotonically to zero, the alternating series converges.

The condition that $(a_n)$ decreases monotonically to zero is crucial. It ensures that the positive and negative terms “cancel out” enough for the partial sums to converge.

Example: The Alternating Harmonic Series

The alternating harmonic series is $1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$. Let’s check if it converges using the Leibniz Criterion: the terms $a_n = \frac{1}{n}$ decrease monotonically to zero, so the series converges.
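
Numerically, the partial sums close in on $\ln 2$, and the Leibniz Criterion even bounds the error by the first omitted term, $|s - s_n| \le a_{n+1}$. A sketch:

```python
import math

def alt_harmonic_partial_sum(n):
    # s_n = 1 - 1/2 + 1/3 - ... +- 1/n
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

# The sum is ln 2; the Leibniz error bound is the first omitted term 1/(n+1).
for n in (10, 100, 1000):
    assert abs(alt_harmonic_partial_sum(n) - math.log(2)) <= 1 / (n + 1)
```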

2.11 Rearrangement of Series: Order Matters (Sometimes!)

For finite sums, the order in which you add terms doesn’t matter. But for infinite series, things are more delicate. Rearranging a series means changing the order of its terms.

Absolutely Convergent Series: Order Doesn’t Matter

If a series is absolutely convergent (i.e., $\sum |a_n|$ converges), then rearranging the terms doesn’t change the sum. The sum is robust to reordering.

Conditionally Convergent Series: Order Matters Greatly!

However, if a series is only conditionally convergent ($\sum a_n$ converges, but $\sum |a_n|$ diverges), then rearranging the terms can change the sum, or even make the series diverge! In fact, by Riemann’s Rearrangement Theorem, such a series can be rearranged to converge to any prescribed real number. This is a surprising and counterintuitive result.

This theorem highlights the delicate nature of conditional convergence. The sum of a conditionally convergent series is not an inherent property of the terms themselves, but depends crucially on the order in which they are added.

Example: Rearranging the Alternating Harmonic Series

We saw the alternating harmonic series converges to $\ln 2$. But we can rearrange it to converge to a different value, or even diverge! The script provides an example of such a rearrangement; a classical one, taking one positive term followed by two negative terms, converges to $\frac{1}{2} \ln 2$.
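
As a sketch of this phenomenon, here is the classical rearrangement taking one positive term followed by two negative terms; each block satisfies $\frac{1}{2k-1} - \frac{1}{4k-2} - \frac{1}{4k} = \frac{1}{2}\left(\frac{1}{2k-1} - \frac{1}{2k}\right)$, so the rearranged sum is $\frac{1}{2}\ln 2$, half the original:

```python
import math

def rearranged_partial_sum(blocks):
    # One positive term, then two negatives: 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
    total = 0.0
    for k in range(1, blocks + 1):
        total += 1 / (2 * k - 1) - 1 / (4 * k - 2) - 1 / (4 * k)
    return total

# Same terms as the alternating harmonic series, but the sum is halved.
assert abs(rearranged_partial_sum(200_000) - 0.5 * math.log(2)) < 1e-5
```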

2.12 The Cauchy Product of Series: Multiplying Infinite Sums

How do we multiply two infinite series? If we have $\sum_{n=0}^{\infty} a_n$ and $\sum_{n=0}^{\infty} b_n$, what is their product? The Cauchy product provides a way to define this product: it is the series $\sum_{n=0}^{\infty} c_n$ with coefficients

$$c_n = \sum_{k=0}^{n} a_k b_{n-k}.$$

This formula arises naturally if you think about multiplying the partial sums and collecting terms with the same total index.

The Cauchy product works well for absolutely convergent series. However, for conditionally convergent series, the Cauchy product may not converge, or if it does, its sum might not be what you expect.

Example: Cauchy Product and the Exponential Function

The exponential function can be defined by its power series: $\exp(x) = \sum_{n=0}^{\infty} \frac{x^n}{n!}$. Let’s use the Cauchy product to multiply $\exp(x)$ and $\exp(y)$: with $a_k = \frac{x^k}{k!}$ and $b_k = \frac{y^k}{k!}$, the binomial theorem gives

$$c_n = \sum_{k=0}^{n} \frac{x^k}{k!} \cdot \frac{y^{n-k}}{(n-k)!} = \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k} = \frac{(x+y)^n}{n!},$$

so the Cauchy product recovers the functional equation $\exp(x)\exp(y) = \exp(x+y)$.
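
We can verify this identity numerically by building the Cauchy-product coefficients directly (the sample values $x = 0.3$, $y = 0.5$ and the truncation length are illustrative choices):

```python
from math import factorial, exp

def cauchy_product_sum(x, y, n_terms=30):
    # a_k = x**k/k!, b_k = y**k/k!, c_n = sum_k a_k * b_{n-k}.
    a = [x**k / factorial(k) for k in range(n_terms)]
    b = [y**k / factorial(k) for k in range(n_terms)]
    c = [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(n_terms)]
    return sum(c)

x, y = 0.3, 0.5
# The Cauchy product of the series for exp(x) and exp(y) sums to exp(x + y).
assert abs(cauchy_product_sum(x, y) - exp(x + y)) < 1e-12
```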

This concludes Chapter 2. We’ve explored sequences and series, their convergence, divergence, and important operations and tests. We’ve seen the power of the Weierstrass Theorem, the Cauchy Criterion, and the utility of Limit Laws. We’ve also touched upon the subtleties of rearranging series and multiplying infinite sums. This foundation will be crucial as we move into the realm of continuous functions and calculus in the following chapters.

Previous Chapter: Chapter 1 - Real Numbers, Euclidean Spaces, and Complex Numbers

Next Chapter: Chapter 3 - Continuous Functions, Smoothness and Limits, Continuity, Functions Without Jumps, Combining Continuous Functions, Fundamental Theorems of Continuity, Exponential and Trigonometric Functions