2.1 The Limit of a Sequence: Getting Closer and Closer
Imagine an infinite list of numbers, one after another: $a_1, a_2, a_3, \ldots$. This is a sequence. We are interested in what happens to these numbers as we go further and further down the list – as $n$ approaches “infinity”. Do they settle down to a particular value? Do they shoot off to infinity themselves? Or do they just bounce around erratically?
We say a sequence $(a_n)_{n \in \mathbb{N}}$ converges to a limit $a$ if the terms $a_n$ get arbitrarily close to $a$ as $n$ gets larger and larger. “Arbitrarily close” is the key idea here. No matter how small a distance you pick, eventually the terms of the sequence will be within that distance of the limit.
Here’s the formal definition, which makes “arbitrarily close” precise using $\varepsilon$ and $N$:
Definition 2.1.1: Convergence of a Sequence
A sequence $(a_n)_{n \in \mathbb{N}}$ of real numbers converges to a limit $a \in \mathbb{R}$ if for every $\varepsilon > 0$ (no matter how small), there exists a natural number $N$ (which might be very large) such that for all $n \geq N$, the absolute difference between $a_n$ and $a$ is less than $\varepsilon$:

$$|a_n - a| < \varepsilon \quad \text{for all } n \geq N.$$

If such an $a$ exists, we write $\lim_{n \to \infty} a_n = a$.
Think of $\varepsilon$ as your “tolerance” for closeness. If you give me any tolerance $\varepsilon > 0$, I can find a point $N$ in the sequence such that everything after that point is within your tolerance of the limit $a$.
Example: Consider the sequence $a_n = 1 - \frac{1}{n}$: the numbers $0, \frac{1}{2}, \frac{2}{3}, \frac{3}{4}, \ldots$. It looks like these numbers are getting closer and closer to 1. Let’s prove that $\lim_{n \to \infty} \left(1 - \frac{1}{n}\right) = 1$.
Example: Proof of Convergence for $a_n = 1 - \frac{1}{n}$
We need to show that for any $\varepsilon > 0$, we can find an $N$ such that for all $n \geq N$, $\left|\left(1 - \frac{1}{n}\right) - 1\right| < \varepsilon$.
First, simplify the expression: $\left|\left(1 - \frac{1}{n}\right) - 1\right| = \left|-\frac{1}{n}\right| = \frac{1}{n}$.
We want $\frac{1}{n} < \varepsilon$. This is equivalent to $n \varepsilon > 1$, or $n > \frac{1}{\varepsilon}$.
By the Archimedean Principle (Corollary 1.1.7), for any $\varepsilon > 0$, we can find a natural number $N$ such that $N > \frac{1}{\varepsilon}$.
Then, for all $n \geq N$, we have $n \geq N > \frac{1}{\varepsilon}$, so $\frac{1}{n} \leq \frac{1}{N} < \varepsilon$, and thus $\left|\left(1 - \frac{1}{n}\right) - 1\right| < \varepsilon$.
Therefore, for any $\varepsilon > 0$, we have found an $N$ such that for all $n \geq N$, $\left|\left(1 - \frac{1}{n}\right) - 1\right| < \varepsilon$. This proves that $\lim_{n \to \infty} \left(1 - \frac{1}{n}\right) = 1$.
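The proof is constructive: it tells you exactly which $N$ works for a given $\varepsilon$. A quick numerical check in Python (the helper names are ours, purely for illustration):

```python
import math

def a(n):
    """Terms of the sequence a_n = 1 - 1/n."""
    return 1 - 1 / n

def n_for_tolerance(eps):
    """The N from the proof: any natural number N > 1/eps works."""
    return math.ceil(1 / eps) + 1

# For each tolerance eps, every checked term past N is within eps of the limit 1.
for eps in (0.1, 0.01, 0.001):
    N = n_for_tolerance(eps)
    assert all(abs(a(n) - 1) < eps for n in range(N, N + 1000))
```

Shrinking $\varepsilon$ simply pushes $N$ further out, exactly as the definition demands.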
Uniqueness and Boundedness of Limits
A sequence can only converge to one limit. It can’t approach two different values at the same time.
Lemma 2.1.3: Uniqueness of Limits
The limit of a convergent sequence is unique.
Proof of Uniqueness of Limits (by contradiction)
Suppose a sequence $(a_n)$ converges to two different limits, $a$ and $b$, with $a \neq b$. Let $\varepsilon = \frac{|a - b|}{2} > 0$.
Since $a_n \to a$, there exists $N_1$ such that for all $n \geq N_1$, $|a_n - a| < \varepsilon$. Since $a_n \to b$, there exists $N_2$ such that for all $n \geq N_2$, $|a_n - b| < \varepsilon$.
Let $N = \max(N_1, N_2)$. Then for $n \geq N$, both inequalities hold. Consider the distance between $a$ and $b$:

$$|a - b| \leq |a - a_n| + |a_n - b| < \varepsilon + \varepsilon = 2\varepsilon = |a - b|.$$

This implies $|a - b| < |a - b|$. If $|a - b| \neq 0$, we can divide both sides by $|a - b|$ to get $1 < 1$, which is a contradiction. Therefore, our initial assumption that $a \neq b$ must be false, and the limit is unique.
If a sequence converges, it must also be bounded. It can’t go off to infinity or become arbitrarily large.
Remark 2.1.5: Boundedness of Convergent Sequences
Every convergent sequence is bounded.
Proof of Boundedness
Let $(a_n)$ be a convergent sequence with limit $a$. Choose $\varepsilon = 1$. By definition of convergence, there exists $N$ such that for all $n \geq N$, $|a_n - a| < 1$. This implies that for $n \geq N$, $|a_n| \leq |a_n - a| + |a| < 1 + |a|$.
So, for $n \geq N$, the terms are bounded by $1 + |a|$. The first terms ($a_1, \ldots, a_{N-1}$) are also finite values. Therefore, we can find an overall bound by taking the maximum of the absolute values of the first $N - 1$ terms and $1 + |a|$.
Let $M = \max(|a_1|, |a_2|, \ldots, |a_{N-1}|, 1 + |a|)$. Then $|a_n| \leq M$ for all $n$, which means the sequence is bounded.
Boundedness is a necessary condition for convergence, but it’s not sufficient. A bounded sequence doesn’t have to converge (think of the sequence $a_n = (-1)^n$, which is bounded but diverges). Convergence is a stronger condition.
2.2 Divergence: When Sequences Go Their Own Way
Not all sequences settle down to a finite limit. Some sequences diverge. Divergence simply means a sequence does not converge to a real number. There are different ways a sequence can diverge. It might oscillate, grow without bound, or behave in some other non-convergent way.
One particular type of divergence is divergence to infinity. This is when the terms of a sequence become larger and larger, exceeding any bound you can set.
Definition: Convergence to Infinity
A sequence $(a_n)$ converges to infinity (also described as diverging to infinity), written $\lim_{n \to \infty} a_n = \infty$, if for every real number $M$, there exists a natural number $N$ such that for all $n \geq N$, $a_n > M$.
Similarly, a sequence converges to negative infinity, written $\lim_{n \to \infty} a_n = -\infty$, if for every real number $M$, there exists a natural number $N$ such that for all $n \geq N$, $a_n < M$.
Example: Consider the sequence $a_n = 2^n$: the numbers $2, 4, 8, 16, \ldots$. This sequence grows rapidly. Let’s show that $\lim_{n \to \infty} 2^n = \infty$.
Example: Proof of Divergence to Infinity for $a_n = 2^n$
We need to show that for any real number $M$, we can find an $N$ such that for all $n \geq N$, $2^n > M$.
First, we use induction to prove that $2^n \geq n$ for all $n \in \mathbb{N}$.
- Base Case (n=1): $2^1 = 2 \geq 1$. True.
- Inductive Step: Assume $2^n \geq n$ for some $n \in \mathbb{N}$. We want to show $2^{n+1} \geq n + 1$. $2^{n+1} = 2 \cdot 2^n \geq 2n = n + n$. Since $n \geq 1$ for $n \in \mathbb{N}$, we have $2^{n+1} \geq n + n \geq n + 1$.
Now, given any real number $M$, by the Archimedean Principle, there exists an integer $N$ such that $N > M$.
For all $n \geq N$, we have $n \geq N > M$, so $n > M$. Since $2^n \geq n$, we have $2^n > M$.
Thus, for any $M$, we found an $N$ such that for all $n \geq N$, $2^n > M$. This proves $\lim_{n \to \infty} 2^n = \infty$.
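Again the proof hands us a concrete $N$ for each bound $M$: any integer $N > M$. A small Python sketch (the function name is ours; this $N$ is far from the smallest that works, since $2^n$ outruns $n$ quickly):

```python
def n_for_bound(M):
    """The N from the proof: any integer N > M (at least 1); then 2**n >= n > M for n >= N."""
    return max(int(M) + 1, 1)

# For each bound M, every checked term past N exceeds M.
for M in (10, 1000, 10**6):
    N = n_for_bound(M)
    assert all(2 ** n > M for n in range(N, N + 20))
```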
2.3 Limit Theorems: The Algebra of Convergence
Working directly with the $\varepsilon$-$N$ definition can be cumbersome. Fortunately, there are powerful theorems that let us compute limits more easily by manipulating convergent sequences algebraically. These are the Limit Laws.
Theorem: Algebraic Limit Laws
Let $(a_n)$ and $(b_n)$ be convergent sequences with limits $a$ and $b$. Then:
- Sum Law: $\lim_{n \to \infty} (a_n + b_n) = a + b$
- Product Law: $\lim_{n \to \infty} (a_n b_n) = ab$
- Quotient Law: If $b \neq 0$ and $b_n \neq 0$ for all $n$, then $\lim_{n \to \infty} \frac{a_n}{b_n} = \frac{a}{b}$
- Comparison Law: If there exists $K \in \mathbb{N}$ such that $a_n \leq b_n$ for all $n \geq K$, then $a \leq b$.
These laws are incredibly useful. They tell us that limits behave nicely with basic arithmetic operations. We can find the limit of a sum by summing the limits, the limit of a product by multiplying the limits, and so on (with some conditions, like $b \neq 0$ for division).
Proof of Sum Law (Theorem 2.1.8 (1))
Let $\varepsilon > 0$. We want to show that there exists $N$ such that for all $n \geq N$, $|(a_n + b_n) - (a + b)| < \varepsilon$.
Since $a_n \to a$, there exists $N_1$ such that for all $n \geq N_1$, $|a_n - a| < \frac{\varepsilon}{2}$. Since $b_n \to b$, there exists $N_2$ such that for all $n \geq N_2$, $|b_n - b| < \frac{\varepsilon}{2}$.
Let $N = \max(N_1, N_2)$. Then for all $n \geq N$, both inequalities hold. Consider $|(a_n + b_n) - (a + b)| = |(a_n - a) + (b_n - b)|$. By the triangle inequality, $|(a_n - a) + (b_n - b)| \leq |a_n - a| + |b_n - b|$. For $n \geq N$, we have $|a_n - a| + |b_n - b| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$.
Thus, for all $n \geq N$, $|(a_n + b_n) - (a + b)| < \varepsilon$, which proves $\lim_{n \to \infty} (a_n + b_n) = a + b$.
Example: Let’s find $\lim_{n \to \infty} \frac{1}{n^2}$.
We know $\lim_{n \to \infty} \frac{1}{n} = 0$. Using the Product Law (Theorem 2.1.8 (2)), we can write:

$$\lim_{n \to \infty} \frac{1}{n^2} = \left( \lim_{n \to \infty} \frac{1}{n} \right) \cdot \left( \lim_{n \to \infty} \frac{1}{n} \right) = 0 \cdot 0 = 0.$$

More generally, for any integer $k \geq 1$, we can use repeated application of the product and quotient laws to show:
Example: Limit of $\frac{1}{n^k}$
For any integer $k \geq 1$, $\lim_{n \to \infty} \frac{1}{n^k} = 0$.
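A quick numerical illustration of this limit (a sketch; the function name is ours):

```python
def term(n, k):
    """a_n = 1 / n**k."""
    return 1 / n ** k

# For each fixed power k, the terms shrink toward the limit 0.
for k in (1, 2, 3):
    assert term(10 ** 6, k) < 10.0 ** (-5)
```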
These Limit Laws are essential tools in analysis, allowing us to break down complex limits into simpler pieces and compute them systematically. They form the basis for much of what comes next, including the study of series.
2.4 Monotone Sequences: Always Going in One Direction
A monotone sequence is one that is always either increasing or always decreasing. They are simpler to analyze than sequences that jump around.
Definition 2.2.1: Monotone Sequences
- A sequence $(a_n)$ is monotonically increasing (or non-decreasing) if $a_{n+1} \geq a_n$ for all $n \in \mathbb{N}$.
- A sequence $(a_n)$ is monotonically decreasing (or non-increasing) if $a_{n+1} \leq a_n$ for all $n \in \mathbb{N}$.
- A sequence is monotone if it is either monotonically increasing or monotonically decreasing.
Think of a sequence walking uphill (increasing) or downhill (decreasing) – it always moves in one direction.
2.5 The Monotone Convergence Theorem (Weierstrass Theorem): Bounded and Monotone = Convergent
A fundamental result in analysis is the Monotone Convergence Theorem, also known as the Weierstrass Theorem. It provides a powerful tool for proving convergence of sequences, especially when you can show a sequence is both monotone and bounded.
Theorem 2.2.2: Monotone Convergence Theorem (Weierstrass)
- If a sequence $(a_n)$ is monotonically increasing and bounded above, then it converges to a limit $a$. The limit is the supremum (least upper bound) of the set of terms: $a = \sup\{a_n : n \in \mathbb{N}\}$.
- If a sequence $(a_n)$ is monotonically decreasing and bounded below, then it converges to a limit $a$. The limit is the infimum (greatest lower bound) of the set of terms: $a = \inf\{a_n : n \in \mathbb{N}\}$.
In simpler terms: If a sequence is going uphill and has a ceiling, it must approach that ceiling (or something below it). If it’s going downhill and has a floor, it must approach that floor (or something above it).
Proof of Weierstrass Theorem (Part 1: Increasing and Bounded Above)
Let $a = \sup\{a_n : n \in \mathbb{N}\}$ be the supremum of the set of terms (which exists because the sequence is bounded above and the set is non-empty). Let $\varepsilon > 0$.
Since $a$ is the least upper bound, $a - \varepsilon$ is not an upper bound. Therefore, there must be at least one term in the sequence, say $a_N$, such that $a_N > a - \varepsilon$.
Since $(a_n)$ is monotonically increasing, for all $n \geq N$, we have $a_n \geq a_N > a - \varepsilon$. Also, since $a$ is an upper bound for all terms, $a_n \leq a$ for all $n$.
Combining these inequalities, for all $n \geq N$, we have $a - \varepsilon < a_n \leq a$.
Thus, for all $n \geq N$, $|a_n - a| < \varepsilon$. This proves that $\lim_{n \to \infty} a_n = a$.
Example: Consider the sequence $a_n = \left(1 + \frac{1}{n}\right)^n$. We can show (though it’s not immediately obvious) that this sequence is monotonically increasing and bounded above. Therefore, by the Weierstrass Theorem, it must converge. Its limit is the famous number $e$.
Example: Convergence of $\left(1 + \frac{1}{n}\right)^n$
The sequence $a_n = \left(1 + \frac{1}{n}\right)^n$ is monotonically increasing and bounded above (though proving these facts requires some work, often using the Binomial Theorem or calculus). Therefore, by the Weierstrass Theorem, it converges. The limit is defined as Euler’s number:

$$e = \lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^n \approx 2.71828\ldots$$
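The monotone approach to $e$ is easy to observe numerically. A short Python check (the subsequence $n = 10, 100, \ldots$ is our choice):

```python
import math

def a(n):
    """a_n = (1 + 1/n)**n, monotonically increasing toward e."""
    return (1 + 1 / n) ** n

terms = [a(10 ** k) for k in range(1, 7)]
assert all(x < y for x, y in zip(terms, terms[1:]))   # increasing along this subsequence
assert abs(terms[-1] - math.e) < 1e-5                 # close to e = 2.71828...
```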
Example: Let’s look at another sequence defined recursively: $a_1 = \sqrt{2}$, $a_{n+1} = \sqrt{2 + a_n}$. We can use the Weierstrass Theorem to show this converges.
Example: Recursive Sequence $a_{n+1} = \sqrt{2 + a_n}$
Define $a_1 = \sqrt{2}$, $a_{n+1} = \sqrt{2 + a_n}$. We want to show $\lim_{n \to \infty} a_n$ exists.
1. Boundedness (above by 2): We use induction to show $a_n \leq 2$ for all $n$.
- Base Case (n=1): $a_1 = \sqrt{2} \leq 2$.
- Inductive Step: Assume $a_n \leq 2$. Then $a_{n+1} = \sqrt{2 + a_n} \leq \sqrt{2 + 2} = \sqrt{4} = 2$.
2. Monotonically Increasing: We use induction to show $a_{n+1} \geq a_n$ for all $n$.
- Base Case (n=1): $a_1 = \sqrt{2} \approx 1.414$, $a_2 = \sqrt{2 + \sqrt{2}} \approx 1.848$. So $a_2 \geq a_1$.
- Inductive Step: Assume $a_{n+1} \geq a_n$. Then $2 + a_{n+1} \geq 2 + a_n$. Taking square roots (which preserves order for non-negative numbers): $\sqrt{2 + a_{n+1}} \geq \sqrt{2 + a_n}$, which means $a_{n+2} \geq a_{n+1}$.
Since $(a_n)$ is monotonically increasing and bounded above, by the Weierstrass Theorem, it converges to a limit, say $a$.
3. Finding the Limit: Since $a_{n+1} = \sqrt{2 + a_n}$, taking the limit as $n \to \infty$ on both sides: $a = \sqrt{2 + a}$. Squaring both sides: $a^2 = 2 + a$, so $a^2 - a - 2 = 0$. Factoring: $(a - 2)(a + 1) = 0$. Possible solutions are $a = 2$ or $a = -1$. Since $a_n \geq 0$ for all $n$, the limit must be non-negative, so $a = 2$.
Therefore, $\lim_{n \to \infty} a_n = 2$.
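The recursion is easy to iterate numerically, and the iterates settle on 2 quickly (a sketch, not from the text):

```python
import math

a = math.sqrt(2)             # a_1 = sqrt(2)
for _ in range(50):
    a = math.sqrt(2 + a)     # a_{n+1} = sqrt(2 + a_n)

# After a few dozen iterations the iterates are numerically indistinguishable from 2.
assert abs(a - 2) < 1e-12
```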
2.6 The Cauchy Criterion: Convergence Without Knowing the Limit
The definition of convergence requires knowing the limit $a$. But what if we don’t know the limit in advance? The Cauchy Criterion provides a way to determine if a sequence converges without knowing its limit. It’s based on the idea that the terms of a convergent sequence must eventually become close to each other.
Theorem 2.4.2: Cauchy Criterion for Convergence
A sequence $(a_n)$ converges if and only if it is a Cauchy sequence. A sequence is Cauchy if for every $\varepsilon > 0$, there exists an $N$ such that for all $m, n \geq N$, $|a_n - a_m| < \varepsilon$.
A Cauchy sequence is one where the terms eventually get arbitrarily close to each other, regardless of what the limit might be.
Proof of Cauchy Criterion (Sketch)
($\Rightarrow$) If a sequence converges, it’s Cauchy: Assume $(a_n)$ converges to a limit $a$. Then for large $n$ and $m$, both $a_n$ and $a_m$ are within $\frac{\varepsilon}{2}$ of $a$. By the triangle inequality, $|a_n - a_m| \leq |a_n - a| + |a - a_m| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$, so $a_n$ and $a_m$ must be close to each other.
($\Leftarrow$) If a sequence is Cauchy, it converges: (This direction is more involved and relies on the completeness of the real numbers).
- Cauchy sequences are bounded: (similar to proof that convergent sequences are bounded).
- Use completeness to find a candidate limit: Define $b_n = \inf_{k \geq n} a_k$ and $c_n = \sup_{k \geq n} a_k$. Show that $(b_n)$ is increasing and bounded above, and $(c_n)$ is decreasing and bounded below. Both converge to limits, say $b$ and $c$.
- Show $b = c$: Use the Cauchy condition to show that $c_n - b_n$ can be made arbitrarily small, forcing $b = c$. Let $a = b = c$.
- Show $\lim_{n \to \infty} a_n = a$: Use the fact that $b_n \leq a_n \leq c_n$ and the Cauchy property to show that $(a_n)$ converges to $a$.
The Cauchy Criterion is a powerful theoretical tool. It’s used to prove many important results in analysis, and it provides a way to rigorously define convergence in more abstract settings where a limit might not be readily apparent.
This concludes our exploration of sequences, convergence, divergence, and the fundamental theorems that govern their behavior. These concepts are the bedrock for understanding series, continuity, differentiation, and integration, which we will explore in the following chapters.
2.7 Series: Adding Up Infinitely Many Terms
Imagine taking a sequence $(a_k)$ and adding up all its terms: $a_1 + a_2 + a_3 + \cdots$. This infinite sum is called a series. Does this sum make sense? Can we actually add up infinitely many numbers and get a finite result? Sometimes yes, sometimes no.
To make sense of an infinite sum, we look at partial sums. The $n$-th partial sum is the sum of the first $n$ terms:

$$s_n = \sum_{k=1}^{n} a_k = a_1 + a_2 + \cdots + a_n.$$
A series converges if the sequence of its partial sums converges to a limit.
Definition 2.7.1: Convergence of a Series
The infinite series $\sum_{k=1}^{\infty} a_k$ converges if the sequence of partial sums $(s_n)$, where $s_n = \sum_{k=1}^{n} a_k$, converges to a limit.
If $\lim_{n \to \infty} s_n = s$, we write $\sum_{k=1}^{\infty} a_k = s$, and $s$ is called the sum of the series. If the sequence of partial sums diverges, the series diverges.
Example: The Geometric Series
Consider the geometric series $\sum_{k=0}^{\infty} q^k = 1 + q + q^2 + q^3 + \cdots$. This series is fundamental in mathematics. When does it converge? And if it converges, what’s its sum?
Example: Convergence of Geometric Series
Let $s_n = 1 + q + q^2 + \cdots + q^n$. Multiply by $q$: $q s_n = q + q^2 + \cdots + q^{n+1}$. Subtract from $s_n$: $s_n - q s_n = 1 - q^{n+1}$. So, $(1 - q) s_n = 1 - q^{n+1}$. If $q \neq 1$, we can solve for $s_n$: $s_n = \frac{1 - q^{n+1}}{1 - q}$.
Now consider the limit as $n \to \infty$. If $|q| < 1$, then $q^{n+1} \to 0$. Thus, if $|q| < 1$, $\lim_{n \to \infty} s_n = \frac{1}{1 - q}$.
If $|q| \geq 1$, the sequence $(q^k)$ does not converge to 0 (for $q = 1$ we have $s_n = n + 1$, which diverges to $\infty$). Thus, if $|q| \geq 1$, the geometric series diverges.
Conclusion: The geometric series converges if and only if $|q| < 1$, and in that case, its sum is $\sum_{k=0}^{\infty} q^k = \frac{1}{1 - q}$.
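The closed form for $s_n$ makes this easy to verify numerically. A Python sketch (the function name is ours):

```python
def geometric_partial(q, n):
    """s_n = 1 + q + ... + q**n via the closed form (1 - q**(n+1)) / (1 - q), q != 1."""
    return (1 - q ** (n + 1)) / (1 - q)

# For |q| < 1 the partial sums approach 1 / (1 - q).
for q in (0.5, -0.3, 0.9):
    assert abs(geometric_partial(q, 500) - 1 / (1 - q)) < 1e-12
```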
Example: The Harmonic Series
The harmonic series is $\sum_{k=1}^{\infty} \frac{1}{k} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$. Does this series converge or diverge? It turns out it diverges, even though the terms $\frac{1}{k}$ go to zero. The terms don’t go to zero fast enough for the sum to be finite.
Example: Divergence of Harmonic Series
The harmonic series diverges. (Proof by grouping terms, similar to Example 1.1.17(ii) in the script: the grouping yields $s_{2^n} \geq 1 + \frac{n}{2}$, so the partial sums grow without bound.)
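The standard grouping bound $s_{2^n} \geq 1 + \frac{n}{2}$ can be checked directly for small $n$ (a numerical sketch; the helper name is ours):

```python
def harmonic(n):
    """Partial sum s_n = 1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

# The grouping argument gives s_{2^n} >= 1 + n/2: the partial sums are unbounded.
for n in range(1, 15):
    assert harmonic(2 ** n) >= 1 + n / 2
```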
2.8 Properties of Convergent Series: Linearity
Like limits of sequences, convergent series behave nicely with addition and scalar multiplication (linearity).
Theorem 2.7.4: Linearity of Series
Let $\sum_{k=1}^{\infty} a_k$ and $\sum_{k=1}^{\infty} b_k$ be convergent series with sums $A$ and $B$, and let $\lambda \in \mathbb{R}$ (or $\mathbb{C}$). Then:
- Sum Law: The series $\sum_{k=1}^{\infty} (a_k + b_k)$ converges and $\sum_{k=1}^{\infty} (a_k + b_k) = A + B$.
- Scalar Multiplication Law: The series $\sum_{k=1}^{\infty} \lambda a_k$ converges and $\sum_{k=1}^{\infty} \lambda a_k = \lambda A$.
These laws allow us to manipulate convergent series in familiar algebraic ways.
2.9 Convergence Tests: Determining if a Series Converges
We need tools to determine if a series converges without having to find its sum explicitly. The Cauchy Criterion for series is one such tool.
Theorem 2.7.5: Cauchy Criterion for Series
The series $\sum_{k=1}^{\infty} a_k$ converges if and only if for every $\varepsilon > 0$, there exists $N$ such that for all $n \geq m \geq N$, $\left| \sum_{k=m}^{n} a_k \right| < \varepsilon$.
This criterion says a series converges if and only if the “tails” of the series (the sums of terms far down the sequence) become arbitrarily small.
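A numerical contrast makes the criterion concrete: for the convergent series $\sum \frac{1}{k^2}$ the block sums $\sum_{k=m}^{n} a_k$ are tiny, while for the harmonic series they are not. A sketch (names and the chosen indices are ours):

```python
def tail(f, m, n):
    """The block sum a_m + ... + a_n appearing in the Cauchy criterion."""
    return sum(f(k) for k in range(m, n + 1))

gap_conv = abs(tail(lambda k: 1 / k ** 2, 10001, 20000))  # sum 1/k^2 converges
gap_div = abs(tail(lambda k: 1 / k, 10001, 20000))        # harmonic series diverges
assert gap_conv < 1e-4        # tail block is tiny: Cauchy behavior
assert gap_div > 0.69         # tail block stays near log 2: the criterion fails
```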
Comparison Test: If we have a series that is “smaller” than a known convergent series, it must also converge. Conversely, if it’s “larger” than a known divergent series, it must also diverge.
Corollary 2.7.7: Comparison Test
Let $\sum a_k$ and $\sum b_k$ be series with non-negative terms ($a_k \geq 0$, $b_k \geq 0$).
- If $\sum b_k$ converges and there exists $K$ such that for all $k \geq K$, $a_k \leq b_k$, then $\sum a_k$ converges.
- If $\sum b_k$ diverges and there exists $K$ such that for all $k \geq K$, $a_k \geq b_k$, then $\sum a_k$ diverges.
Absolute Convergence Test: If a series of absolute values converges, then the original series must also converge. This is a powerful test because it allows us to use comparison tests (which require non-negative terms) on series with both positive and negative terms.
Theorem 2.7.10: Absolute Convergence Test
If the series $\sum_{k=1}^{\infty} |a_k|$ converges (we say $\sum a_k$ is absolutely convergent), then the series $\sum_{k=1}^{\infty} a_k$ also converges. Furthermore, $\left| \sum_{k=1}^{\infty} a_k \right| \leq \sum_{k=1}^{\infty} |a_k|$.
Ratio Test and Root Test: These are practical tests that use ratios or roots of terms to determine convergence.
Theorem 2.7.17: Ratio Test
Let $\sum a_k$ be a series with $a_k \neq 0$ for all $k$.
- If $\limsup_{k \to \infty} \left| \frac{a_{k+1}}{a_k} \right| < 1$, then the series converges absolutely.
- If $\liminf_{k \to \infty} \left| \frac{a_{k+1}}{a_k} \right| > 1$, then the series diverges.
Theorem 2.7.20: Root Test
Let $\sum a_k$ be a series.
- If $\limsup_{k \to \infty} \sqrt[k]{|a_k|} < 1$, then the series converges absolutely.
- If $\limsup_{k \to \infty} \sqrt[k]{|a_k|} > 1$, then the series diverges.
These tests provide practical methods for checking convergence of many series, especially power series and series where the terms have factorial or exponential components.
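As a sketch of the ratio test in action, take the example series $\sum k^2 / 2^k$ (our choice, not from the text); the ratios tend to $\frac{1}{2} < 1$, so the series converges absolutely:

```python
def a(k):
    """An example term with exponential decay: a_k = k**2 / 2**k."""
    return k ** 2 / 2 ** k

# The ratios |a_{k+1} / a_k| = ((k+1)/k)**2 / 2 approach 1/2 < 1.
ratios = [a(k + 1) / a(k) for k in range(1, 100)]
assert abs(ratios[-1] - 0.5) < 0.02
# The partial sums indeed settle down (the exact sum happens to be 6).
assert abs(sum(a(k) for k in range(1, 200)) - 6) < 1e-6
```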
This is just the beginning of our journey into series. There’s much more to explore, including alternating series, power series, and the fascinating world of uniform convergence and its implications for calculus operations on infinite sums.
2.10 Alternating Series: A Dance of Plus and Minus
An alternating series is a series where the terms alternate in sign, like:

$$a_1 - a_2 + a_3 - a_4 + \cdots = \sum_{k=1}^{\infty} (-1)^{k+1} a_k,$$

where $a_k > 0$ for all $k$. These series can converge even when the series of absolute values diverges (i.e., they can be conditionally convergent). The Alternating Series Test (Leibniz Criterion) gives us a simple condition for their convergence.
Theorem 2.7.12: Alternating Series Test (Leibniz Criterion)
Let $(a_k)$ be a sequence of positive terms ($a_k > 0$) that is monotonically decreasing ($a_{k+1} \leq a_k$) and converges to zero ($\lim_{k \to \infty} a_k = 0$). Then the alternating series $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ converges. Furthermore, the sum $s$ of the series satisfies the bounds: $s_{2n} \leq s \leq s_{2n+1}$ for all $n$, and hence $|s - s_n| \leq a_{n+1}$.
The condition that $(a_k)$ decreases to zero is crucial. It ensures that the positive and negative terms “cancel out” enough for the partial sums to converge.
Proof of Alternating Series Test (Sketch)
Let $s_n = \sum_{k=1}^{n} (-1)^{k+1} a_k$ be the partial sums.
- Even partial sums are increasing and bounded above: $s_{2n+2} = s_{2n} + (a_{2n+1} - a_{2n+2}) \geq s_{2n}$ since $(a_k)$ is decreasing. Also, $s_{2n} < s_{2n} + a_{2n+1} = s_{2n+1} \leq s_1$ since $a_{2n+1} > 0$.
- Odd partial sums are decreasing and bounded below: $s_{2n+1} = s_{2n-1} - (a_{2n} - a_{2n+1}) \leq s_{2n-1}$. Also, $s_{2n+1} = s_{2n} + a_{2n+1} \geq s_{2n} \geq s_2$ since $a_{2n+1} > 0$.
- Both even and odd subsequences converge: By the Monotone Convergence Theorem, both $(s_{2n})$ and $(s_{2n+1})$ converge to some limits, say $s_{\text{even}}$ and $s_{\text{odd}}$.
- Limits are equal: $s_{2n+1} - s_{2n} = a_{2n+1} \to 0$ as $n \to \infty$ (since $a_k \to 0$). Thus $s_{\text{odd}} - s_{\text{even}} = 0$, so $s_{\text{odd}} = s_{\text{even}} = s$.
- Full sequence converges: Since both even and odd subsequences converge to the same limit $s$, the entire sequence $(s_n)$ converges to $s$.
- Error bounds: From the monotone behavior, we can deduce $s_{2n} \leq s \leq s_{2n+1}$, which gives the error bounds $|s - s_n| \leq a_{n+1}$.
Example: The Alternating Harmonic Series
The alternating harmonic series is $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots$. Let’s check if it converges using the Leibniz Criterion.
Example: Convergence of Alternating Harmonic Series
Let $a_k = \frac{1}{k}$.
- $a_k = \frac{1}{k} > 0$ for all $k$. (Positive terms)
- $a_{k+1} = \frac{1}{k+1} < \frac{1}{k} = a_k$. (Monotonically decreasing)
- $\lim_{k \to \infty} \frac{1}{k} = 0$. (Limit goes to zero)
All conditions of the Leibniz Criterion are satisfied. Therefore, the alternating harmonic series converges.
Its sum is actually $\ln 2 \approx 0.693$, which lies between $s_2 = \frac{1}{2}$ and $s_3 = \frac{5}{6}$, as predicted by the error bounds.
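Both the convergence and the Leibniz error bound $|s - s_n| \leq a_{n+1}$ are easy to watch numerically (a sketch; the helper name is ours):

```python
import math

def s(n):
    """n-th partial sum of 1 - 1/2 + 1/3 - 1/4 + ..."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

# The Leibniz error bound with s = ln 2: |s - s_n| <= a_{n+1} = 1/(n+1).
for n in (10, 100, 1000):
    assert abs(s(n) - math.log(2)) <= 1 / (n + 1)
```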
2.11 Rearrangement of Series: Order Matters (Sometimes!)
For finite sums, the order in which you add terms doesn’t matter. But for infinite series, things are more delicate. Rearranging a series means changing the order of its terms.
Absolutely Convergent Series: Order Doesn’t Matter
If a series $\sum a_k$ is absolutely convergent (i.e., $\sum |a_k|$ converges), then rearranging the terms doesn’t change the sum. The sum is robust to reordering.
Theorem 2.7.16: Dirichlet's Theorem (Rearrangement of Absolutely Convergent Series)
If a series $\sum_{k=1}^{\infty} a_k$ is absolutely convergent, then any rearrangement $\sum_{k=1}^{\infty} a_{\sigma(k)}$ (where $\sigma$ is a bijection of $\mathbb{N}$) converges to the same sum.
Proof of Dirichlet's Theorem (Sketch)
- Use the Cauchy Criterion for absolute convergence: For any $\varepsilon > 0$, the “tail” $\sum_{k > N} |a_k|$ of the absolute series is smaller than $\varepsilon$ for large enough $N$.
- Consider a rearrangement $\sum a_{\sigma(k)}$. Let $t_m$ be the $m$-th partial sum of the rearranged series and $s_n$ be the $n$-th partial sum of the original series.
- For large enough $m$, the terms in $t_m$ will include all the terms in $s_N$ for some large $N$. The difference $t_m - s_N$ will consist of terms from the “tail” of the original series (terms with index greater than $N$).
- Because of absolute convergence, this “tail” sum can be made smaller than $\varepsilon$. Thus, $t_m$ is close to $s_N$, which is close to the sum of the original series. This shows the rearranged series converges to the same sum.
Conditionally Convergent Series: Order Matters Greatly!
However, if a series is only conditionally convergent (converges, but not absolutely), then rearranging the terms can change the sum, or even make the series diverge! This is a surprising and counterintuitive result.
Theorem: Riemann Rearrangement Theorem
If a series $\sum a_k$ is conditionally convergent, then for any real number $s$, there exists a rearrangement of the series that converges to $s$. Furthermore, there exist rearrangements that diverge to $+\infty$ or $-\infty$.
This theorem highlights the delicate nature of conditional convergence. The sum of a conditionally convergent series is not an inherent property of the terms themselves, but depends crucially on the order in which they are added.
Example: Rearranging the Alternating Harmonic Series
We saw the alternating harmonic series converges to $\ln 2$. But we can rearrange it to converge to a different value, or even diverge! The script provides an example of a rearrangement with a different sum.
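The idea behind Riemann's theorem is a greedy algorithm: take positive terms until the running sum exceeds the target, then negative terms until it drops below, and repeat. A sketch for the alternating harmonic series with target 1 (our choice of target; an illustration, not the script's construction):

```python
def rearranged_partial(target, steps):
    """Greedy rearrangement of 1 - 1/2 + 1/3 - ...: add positive terms
    1, 1/3, 1/5, ... while at or below the target, else negative terms -1/2, -1/4, ..."""
    pos, neg = 1, 2      # next odd (positive) and even (negative) denominators
    total = 0.0
    for _ in range(steps):
        if total <= target:
            total += 1 / pos
            pos += 2
        else:
            total -= 1 / neg
            neg += 2
    return total

# The same terms, reordered, now hug an arbitrary target instead of ln 2.
assert abs(rearranged_partial(1.0, 100000) - 1.0) < 0.01
```

Each crossing of the target overshoots by at most the size of the term just added, and those terms shrink to zero, which is why the greedy sums converge.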
2.12 The Cauchy Product of Series: Multiplying Infinite Sums
How do we multiply two infinite series? If we have $\sum_{n=0}^{\infty} a_n$ and $\sum_{n=0}^{\infty} b_n$, what is their product? The Cauchy product provides a way to define this product.
Definition 2.7.24: Cauchy Product of Series
The Cauchy product of two series $\sum_{n=0}^{\infty} a_n$ and $\sum_{n=0}^{\infty} b_n$ is the series $\sum_{n=0}^{\infty} c_n$, where the $n$-th term is given by:

$$c_n = \sum_{k=0}^{n} a_k b_{n-k}.$$
This formula arises naturally if you think about multiplying the partial sums and collecting terms with the same total index.
Theorem 2.7.23: Convergence of Cauchy Product
If both series $\sum a_n$ and $\sum b_n$ are absolutely convergent, then their Cauchy product $\sum c_n$ is also absolutely convergent, and its sum is the product of the sums of the original series:

$$\sum_{n=0}^{\infty} c_n = \left( \sum_{n=0}^{\infty} a_n \right) \left( \sum_{n=0}^{\infty} b_n \right).$$
The Cauchy product works well for absolutely convergent series. However, for conditionally convergent series, the Cauchy product may not converge, or if it does, its sum might not be what you expect.
Example: Cauchy Product and the Exponential Function
The exponential function can be defined by its power series: $\exp(x) = \sum_{n=0}^{\infty} \frac{x^n}{n!}$. Let’s use the Cauchy product to multiply $\exp(x)$ and $\exp(y)$.
Example: Cauchy Product of Exponential Series
Let $a_n = \frac{x^n}{n!}$ and $b_n = \frac{y^n}{n!}$. The $n$-th term of their Cauchy product is:

$$c_n = \sum_{k=0}^{n} \frac{x^k}{k!} \cdot \frac{y^{n-k}}{(n-k)!} = \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}.$$

By the Binomial Theorem, $\sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k} = (x + y)^n$.
So, $c_n = \frac{(x + y)^n}{n!}$. Thus, the Cauchy product series is $\sum_{n=0}^{\infty} \frac{(x + y)^n}{n!}$, which is the power series for $\exp(x + y)$.
This shows that $\exp(x) \exp(y) = \exp(x + y)$, which is a fundamental property of the exponential function, derived here using the Cauchy product of their series representations.
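The identity $c_n = \frac{(x+y)^n}{n!}$ can be verified numerically with truncated series (a sketch; the function names and the values of $x$, $y$ are ours):

```python
import math

def exp_coeffs(x, N):
    """First N+1 power-series terms a_n = x**n / n! of exp(x)."""
    return [x ** n / math.factorial(n) for n in range(N + 1)]

def cauchy_product(a, b):
    """c_n = sum_{k=0}^{n} a_k * b_{n-k}."""
    N = min(len(a), len(b)) - 1
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(N + 1)]

x, y, N = 0.7, 1.3, 30
c = cauchy_product(exp_coeffs(x, N), exp_coeffs(y, N))
# Each c_n matches (x + y)**n / n!, so the truncated product sums to about exp(x + y).
assert all(abs(cn - (x + y) ** n / math.factorial(n)) < 1e-12 for n, cn in enumerate(c))
assert abs(sum(c) - math.exp(x + y)) < 1e-9
```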
This concludes Chapter 2. We’ve explored sequences and series, their convergence, divergence, and important operations and tests. We’ve seen the power of the Weierstrass Theorem, the Cauchy Criterion, and the utility of Limit Laws. We’ve also touched upon the subtleties of rearranging series and multiplying infinite sums. This foundation will be crucial as we move into the realm of continuous functions and calculus in the following chapters.